Paper Title
A $J$-Symmetric Quasi-Newton Method for Minimax Problems
Paper Authors
Paper Abstract
Minimax problems have gained tremendous attention across the optimization and machine learning communities recently. In this paper, we introduce a new quasi-Newton method for minimax problems, which we call the $J$-symmetric quasi-Newton method. The method is obtained by exploiting the $J$-symmetric structure of the second-order derivative of the objective function in minimax problems. We show that the Hessian estimate (as well as its inverse) can be updated by a rank-2 operation, and it turns out that the update rule is a natural generalization of the classic Powell symmetric Broyden (PSB) method from minimization problems to minimax problems. In theory, we show that our proposed quasi-Newton algorithm enjoys local Q-superlinear convergence to a desirable solution under standard regularity conditions. Furthermore, we introduce a trust-region variant of the algorithm that enjoys global R-superlinear convergence. Finally, we present numerical experiments that verify our theory and show the effectiveness of our proposed algorithms compared to Broyden's method and the extragradient method on three classes of minimax problems.
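As a point of reference for the structure named in the abstract: for a minimax problem $\min_x \max_y f(x,y)$, the saddle-point gradient operator and its Jacobian are commonly written as below (our own illustration under the standard convention; the paper's definitions may differ in sign conventions):
\[
F(z) = \begin{pmatrix} \nabla_x f(x,y) \\ -\nabla_y f(x,y) \end{pmatrix},
\qquad
J = \begin{pmatrix} I & 0 \\ 0 & -I \end{pmatrix},
\qquad
DF(z) = J\,\nabla^2 f(z),
\]
so that $J\,DF(z) = \nabla^2 f(z)$ is symmetric (using $J^2 = I$); that is, $DF$ is $J$-symmetric in the sense $(J\,DF)^\top = J\,DF$. A quasi-Newton estimate $B \approx DF$ can therefore be constrained to keep $JB$ symmetric.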
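To make the rank-2 update concrete, here is a minimal sketch of one natural $J$-symmetric PSB-style update: apply the classic PSB correction to the symmetric surrogate $JB$ with secant target $Jy$, then map back. This is our own illustrative construction under the convention above, not necessarily the exact rule derived in the paper; the function name j_psb_update and the variables B, s, y are hypothetical.

    import numpy as np

    def j_psb_update(B, s, y, J):
        # Rank-2 PSB-style update that keeps J @ B symmetric and enforces
        # the secant condition B_new @ s = y (relies on J @ J = I).
        Bt = J @ B            # symmetric surrogate of the Jacobian estimate
        yt = J @ y            # transformed secant target
        r = yt - Bt @ s       # secant residual
        ss = s @ s
        Bt_new = (Bt
                  + (np.outer(r, s) + np.outer(s, r)) / ss
                  - (s @ r) / ss**2 * np.outer(s, s))  # classic PSB correction
        return J @ Bt_new     # map back to a J-symmetric estimate

    # Example: J = diag(I_n, -I_m) for n minimization and m maximization variables.
    n, m = 3, 2
    J = np.diag(np.concatenate([np.ones(n), -np.ones(m)]))

One can verify directly that the returned matrix B_new satisfies B_new @ s == y (up to rounding) and that J @ B_new is symmetric, mirroring the secant and $J$-symmetry requirements discussed in the abstract.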