Paper title
Proximal gradient methods beyond monotony
Paper authors
Paper abstract
We address composite optimization problems, which consist in minimizing the sum of a smooth and a merely lower semicontinuous function, without any convexity assumptions. Numerical solutions of these problems can be obtained by proximal gradient methods, which often rely on a line search procedure as a globalization mechanism. We consider an adaptive nonmonotone proximal gradient scheme based on an averaged merit function and establish asymptotic convergence guarantees under weak assumptions, delivering results on par with the monotone strategy. Global worst-case rates for the iterates and a stationarity measure are also derived. Finally, a numerical example indicates the potential of nonmonotonicity and spectral approximations.
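The ingredients named in the abstract (a proximal gradient step, a nonmonotone acceptance test against an averaged merit function, and a spectral stepsize) can be illustrated with a minimal sketch. This is not the paper's exact algorithm: the averaging rule below is the classical Zhang–Hager weighted mean, the spectral stepsize is a plain Barzilai–Borwein quotient, and all parameter names (`eta`, `sigma`, `gamma0`) are assumptions made for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1 (used in the usage example below)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def nonmonotone_prox_grad(f, grad_f, g, prox_g, x0, gamma0=1.0,
                          eta=0.85, sigma=1e-4, max_iter=500, tol=1e-8):
    """Sketch of a nonmonotone proximal gradient method for min f(x) + g(x),
    with Zhang-Hager averaging and a Barzilai-Borwein (spectral) stepsize."""
    x = x0.copy()
    phi = f(x) + g(x)          # composite objective value at the iterate
    C, Q = phi, 1.0            # averaged merit value and its weight
    gamma = gamma0             # current stepsize
    for _ in range(max_iter):
        gx = grad_f(x)
        # backtracking: shrink gamma until the proximal step achieves
        # sufficient decrease relative to the averaged merit C (not phi)
        while True:
            x_new = prox_g(x - gamma * gx, gamma)
            step = x_new - x
            phi_new = f(x_new) + g(x_new)
            if phi_new <= C - (sigma / (2.0 * gamma)) * (step @ step):
                break
            gamma *= 0.5
        if np.linalg.norm(step) / gamma < tol:   # stationarity measure
            x = x_new
            break
        # spectral (Barzilai-Borwein) stepsize for the next iteration
        y = grad_f(x_new) - gx
        sy = step @ y
        gamma = (step @ step) / sy if sy > 1e-12 else gamma0
        # update the averaged merit: C is a weighted mean of past values,
        # so individual iterations may increase the objective
        Q_new = eta * Q + 1.0
        C = (eta * Q * C + phi_new) / Q_new
        Q = Q_new
        x = x_new
    return x
```

As a usage example, a small least-squares problem with an l1 term (so `prox_g` is soft-thresholding) can be passed in directly:

```python
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 10)), rng.standard_normal(20), 0.1
x = nonmonotone_prox_grad(
    f=lambda x: 0.5 * np.sum((A @ x - b) ** 2),
    grad_f=lambda x: A.T @ (A @ x - b),
    g=lambda x: lam * np.sum(np.abs(x)),
    prox_g=lambda v, t: soft_threshold(v, lam * t),
    x0=np.zeros(10),
)
```

Because acceptance is tested against the average `C` rather than the last value `phi`, larger (spectral) steps are accepted more often, which is the nonmonotone behavior the abstract refers to.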