Paper Title
Accelerated Additive Schwarz Methods for Convex Optimization with Adaptive Restart
Paper Authors
Paper Abstract
Based on an observation that additive Schwarz methods for general convex optimization can be interpreted as gradient methods, we propose an acceleration scheme for additive Schwarz methods. Adopting acceleration techniques developed for gradient methods, such as momentum and adaptive restarting, the convergence rate of additive Schwarz methods is greatly improved. The proposed acceleration scheme does not require any a priori information on the levels of smoothness and sharpness of a target energy functional, so it can be applied to various convex optimization problems. Numerical results for linear elliptic problems, nonlinear elliptic problems, nonsmooth problems, and nonsharp problems are provided to highlight the superiority and broad applicability of the proposed scheme.
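To make the acceleration ingredients mentioned in the abstract concrete, below is a minimal Python sketch of a gradient method with Nesterov-type momentum and function-value adaptive restarting, applied to an illustrative quadratic problem. This is only a sketch of the generic acceleration/restart idea for gradient methods, not the paper's additive Schwarz scheme; the function names, step-size choice, and test problem are assumptions made for illustration.

import numpy as np

# Sketch: accelerated gradient descent with momentum and adaptive
# (function-value) restarting.  Illustrative only; NOT the additive
# Schwarz acceleration scheme proposed in the paper.

def accelerated_gradient_restart(grad, f, x0, step, max_iter=500, tol=1e-10):
    """Gradient method with Nesterov-type momentum and adaptive restart."""
    x = x0.copy()
    y = x0.copy()          # extrapolated point
    t = 1.0                # momentum parameter
    f_prev = f(x)
    for k in range(max_iter):
        x_new = y - step * grad(y)                       # gradient step at extrapolated point
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t)) # momentum update
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # extrapolation with momentum
        f_new = f(x_new)
        if f_new > f_prev:                               # adaptive restart: energy increased
            y = x_new                                    # drop accumulated momentum
            t_new = 1.0
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k
        x, t, f_prev = x_new, t_new, f_new
    return x, max_iter

if __name__ == "__main__":
    # Illustrative strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((50, 50))
    A = Q.T @ Q + np.eye(50)
    b = rng.standard_normal(50)
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    step = 1.0 / np.linalg.norm(A, 2)                    # 1/L step size
    x_star, iters = accelerated_gradient_restart(grad, f, np.zeros(50), step)
    print(f"converged in {iters} iterations, residual {np.linalg.norm(A @ x_star - b):.2e}")

The restart test here compares successive energy values and discards the momentum whenever the energy increases, which is what allows the method to run without prior knowledge of smoothness or sharpness parameters; the paper applies this kind of parameter-free restarting at the level of the additive Schwarz iteration.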