Title

Convergence of adaptive algorithms for weakly convex constrained optimization

Authors

Ahmet Alacaoglu, Yura Malitsky, Volkan Cevher

Abstract

We analyze the adaptive first-order algorithm AMSGrad for solving constrained stochastic optimization problems with a weakly convex objective. We prove an $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope, which is the standard stationarity measure for this class of problems. This matches the known rates that adaptive algorithms enjoy in the specific case of unconstrained smooth stochastic optimization. Our analysis works with a mini-batch size of $1$, constant first- and second-order moment parameters, and possibly unbounded optimization domains. Finally, we illustrate applications and extensions of our results to specific problems and algorithms.
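For readers unfamiliar with this stationarity measure, here is a standard definition (our notation, not taken verbatim from the paper): for a $\rho$-weakly convex function $\varphi$ and $\lambda < 1/\rho$, the Moreau envelope is

$$\varphi_\lambda(x) = \min_{y}\Big\{\varphi(y) + \tfrac{1}{2\lambda}\|y - x\|^2\Big\}, \qquad \|\nabla \varphi_\lambda(x)\| = \tfrac{1}{\lambda}\,\big\|x - \operatorname{prox}_{\lambda\varphi}(x)\big\|.$$

In the constrained setting, $\varphi$ is the objective plus the indicator function of the constraint set. A small gradient norm $\|\nabla\varphi_\lambda(x)\|$ certifies that $x$ lies close to the nearly stationary point $\operatorname{prox}_{\lambda\varphi}(x)$, which is why this quantity serves as the convergence measure for weakly convex problems.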

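As a concrete illustration of the update being analyzed, below is a minimal Python sketch of projected AMSGrad with a mini-batch size of $1$ and constant moment parameters, matching the setting of the abstract. The toy objective, the projection, and all names (stochastic_grad, project_ball, the step size) are our own illustrative assumptions, not the paper's code.

import numpy as np

rng = np.random.default_rng(0)

def stochastic_grad(x):
    # Toy single-sample stochastic subgradient oracle for
    # f(x) = ||x - b||_1 (convex, hence weakly convex), plus noise.
    b = np.ones_like(x)
    return np.sign(x - b) + 0.1 * rng.standard_normal(x.shape)

def project_ball(z, radius=1.0):
    # Euclidean projection onto {x : ||x|| <= radius}, standing in
    # for a generic closed convex constraint set.
    n = np.linalg.norm(z)
    return z if n <= radius else z * (radius / n)

def amsgrad(x0, steps=1000, lr=1e-2, beta1=0.9, beta2=0.99, eps=1e-8):
    # Projected AMSGrad with constant beta1/beta2, as in the abstract.
    x = x0.copy()
    m = np.zeros_like(x)       # first-moment estimate
    v = np.zeros_like(x)       # second-moment estimate
    v_hat = np.zeros_like(x)   # running max of v (the AMSGrad correction)
    for _ in range(steps):
        g = stochastic_grad(x)                 # mini-batch of size 1
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        v_hat = np.maximum(v_hat, v)           # keeps effective steps non-increasing
        x = project_ball(x - lr * m / (np.sqrt(v_hat) + eps))
    return x

x_final = amsgrad(np.zeros(5))

The running maximum v_hat is what distinguishes AMSGrad from Adam: it guarantees a non-increasing effective step size, which is central to convergence analyses of this algorithm family.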