Paper Title

Some Adaptive First-order Methods for Variational Inequalities with Relatively Strongly Monotone Operators and Generalized Smoothness

Authors

Titov, A. A., Ablaev, S. S., Alkousa, M. S., Stonyakin, F. S., Gasnikov, A. V.

Abstract

In this paper, we introduce some adaptive methods for solving variational inequalities with relatively strongly monotone operators. Firstly, we focus on a modification, for the generalized smooth (Hölder-continuous) saddle point problem, of the adaptive numerical method recently proposed for the smooth case [1], which has convergence rate estimates similar to those of accelerated methods. We provide the motivation for such an approach and obtain theoretical results for the proposed method. Our second focus is the adaptation of widespread recently proposed methods for solving variational inequalities with relatively strongly monotone operators. The key idea of our approach is to dispense with the well-known restart technique, which in some cases causes difficulties when implementing such algorithms for applied problems. Nevertheless, our algorithms show a rate of convergence comparable to that of algorithms based on the above-mentioned restart technique. We also present numerical experiments that demonstrate the effectiveness of the proposed methods.

[1] Jin, Y., Sidford, A., & Tian, K. (2022). Sharper rates for separable minimax and finite sum optimization via primal-dual extragradient methods. arXiv preprint arXiv:2202.04640.
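The paper's specific algorithms are given in the full text. As a rough, hypothetical illustration of the general setting (and not the authors' exact method), the sketch below implements the classical extragradient template for a variational inequality with a strongly monotone operator, using a simple backtracking rule that adapts the step size to a local Lipschitz estimate instead of requiring the constant in advance; the function name, step rule, and test problem are all assumptions made for this example.

```python
import numpy as np

def adaptive_extragradient(F, z0, iters=500, L0=1.0):
    """Extragradient method for the VI: find z* with F(z*) = 0 (unconstrained case),
    with a backtracking estimate L of the local Lipschitz constant of F."""
    z = z0.copy()
    L = L0
    for _ in range(iters):
        while True:
            step = 1.0 / (2.0 * L)
            w = z - step * F(z)          # extrapolation step
            z_new = z - step * F(w)      # update step
            # accept the step if the local Lipschitz test holds:
            # ||F(w) - F(z)|| <= L * ||w - z||
            if np.linalg.norm(F(w) - F(z)) <= L * np.linalg.norm(w - z) + 1e-12:
                break
            L *= 2.0                     # estimate too small: increase and retry
        L = max(L0, L / 2.0)             # let the estimate decrease again
        z = z_new
    return z

# Toy test problem: affine operator F(z) = M z + b, where M is the sum of a
# skew-symmetric matrix and the identity, so F is 1-strongly monotone and the
# unique solution solves M z = -b.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = 0.5 * (A - A.T) + np.eye(4)
b = rng.standard_normal(4)
F = lambda z: M @ z + b

z_star = np.linalg.solve(M, -b)
z_hat = adaptive_extragradient(F, np.zeros(4))
err = np.linalg.norm(z_hat - z_star)
```

For a strongly monotone, Lipschitz operator this template converges linearly, and the backtracking loop always terminates because doubling eventually makes `L` exceed the true Lipschitz constant.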
