Paper Title

Second-order Online Nonconvex Optimization

Authors

Antoine Lesage-Landry, Joshua A. Taylor, Iman Shames

Abstract

We present the online Newton's method, a single-step second-order method for online nonconvex optimization. We analyze its performance and obtain a dynamic regret bound that is linear in the cumulative variation between round optima. We show that if the variation between round optima is limited, the method leads to a constant regret bound. In the general case, the online Newton's method outperforms online convex optimization algorithms for convex functions and performs similarly to a specialized algorithm for strongly convex functions. We simulate the performance of the online Newton's method on a nonlinear, nonconvex moving target localization example and find that it outperforms a first-order approach.
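The abstract does not spell out the update rule, but a "single-step second-order method" suggests one Newton step per round on the most recently revealed loss. The sketch below is a minimal illustration under that assumption; the damping term and the toy drifting loss are illustrative choices, not taken from the paper.

```python
import numpy as np

def online_newton_step(x, grad_fn, hess_fn, damping=1e-6):
    """One round of a single-step Newton update: x_{t+1} = x_t - H_t(x_t)^{-1} g_t(x_t).

    `damping` is a small regularizer added to the Hessian so the linear solve
    stays well-posed near singular points (an illustrative choice, not part of
    the paper's statement).
    """
    g = grad_fn(x)
    H = hess_fn(x)
    H_reg = H + damping * np.eye(len(x))
    return x - np.linalg.solve(H_reg, g)

# Toy usage: track the minimizer of a slowly drifting nonconvex loss
# f_t(x) = ||x - c_t||^2 + 0.1 * sin(x[0]), revealed one round at a time.
rng = np.random.default_rng(0)
x = np.zeros(2)
c = np.array([1.0, -1.0])
for t in range(50):
    c = c + 0.01 * rng.standard_normal(2)  # the round optimum drifts slightly
    grad = lambda x, c=c: 2.0 * (x - c) + 0.1 * np.array([np.cos(x[0]), 0.0])
    hess = lambda x, c=c: 2.0 * np.eye(2) + 0.1 * np.diag([-np.sin(x[0]), 0.0])
    x = online_newton_step(x, grad, hess)  # one Newton step per round
```

Because the drift of the round optima `c_t` is small here, the cumulative variation between round optima stays bounded, which is the regime in which the abstract's dynamic regret bound becomes a constant.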
