Paper Title
Faster Accelerated First-order Methods for Convex Optimization with Strongly Convex Function Constraints
Paper Authors
Paper Abstract
In this paper, we introduce faster accelerated primal-dual algorithms for minimizing a convex function subject to strongly convex function constraints. Prior to our work, the best complexity bound was $\mathcal{O}(1/\varepsilon)$, regardless of the strong convexity of the constraint function, and it was unclear whether the strong convexity assumption could enable better convergence results. To address this question, we develop novel techniques to progressively estimate the strong convexity of the Lagrangian function. Our approach, for the first time, effectively leverages the strong convexity of the constraints, obtaining an improved complexity of $\mathcal{O}(1/\sqrt{\varepsilon})$. This rate matches the complexity lower bound for strongly-convex-concave saddle point optimization and is therefore order-optimal. We demonstrate the superior performance of our methods on sparsity-inducing constrained optimization, notably Google's personalized PageRank problem. Furthermore, we show that a restarted version of the proposed methods can identify the sparsity pattern of the optimal solution within a finite number of steps, a result that appears to be of independent interest.
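To make the problem class concrete, here is a minimal, hypothetical Python sketch — not the paper's accelerated method — of a convex (linear) objective minimized over a strongly convex function constraint $g(x) = \tfrac{1}{2}\|x\|^2 - r \le 0$, solved by plain projected dual subgradient ascent. It illustrates the structural fact the abstract exploits: for any multiplier $\lambda > 0$, the Lagrangian $L(x, \lambda) = c^\top x + \lambda\, g(x)$ is $\lambda$-strongly convex in $x$. All names and values (`c`, `r`, `lam`, `step`) are illustrative assumptions.

```python
import numpy as np

# Hypothetical example problem (not from the paper):
#   minimize  c @ x   subject to  g(x) = 0.5*||x||^2 - r <= 0.
# For any dual multiplier lam > 0, the Lagrangian
#   L(x, lam) = c @ x + lam * (0.5*||x||^2 - r)
# is lam-strongly convex in x -- the property the proposed methods
# progressively estimate and leverage.

rng = np.random.default_rng(0)
n = 50
c = rng.standard_normal(n)   # linear objective
r = 1.0                      # constraint level

def primal_min(lam):
    # argmin_x L(x, lam) is available in closed form here: x = -c / lam
    return -c / lam

lam, step = 1.0, 0.5
for _ in range(200):
    x = primal_min(lam)
    g = 0.5 * x @ x - r              # constraint value (violation if > 0)
    lam = max(lam + step * g, 1e-8)  # projected dual subgradient ascent

print("constraint value:", 0.5 * x @ x - r)  # ~0 at the boundary solution
print("objective:", c @ x)
```

This baseline only shows where the constraint's strong convexity enters the Lagrangian; the paper's contribution is an accelerated primal-dual scheme that exploits that structure to reach the improved $\mathcal{O}(1/\sqrt{\varepsilon})$ rate.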