Paper Title
A Conditional Gradient-based Method for Simple Bilevel Optimization with Convex Lower-level Problem
Paper Authors
Paper Abstract
In this paper, we study a class of bilevel optimization problems, also known as simple bilevel optimization, where we minimize a smooth objective function over the optimal solution set of another convex constrained optimization problem. Several iterative methods have been developed for tackling this class of problems. Alas, their convergence guarantees are either asymptotic for the upper-level objective, or the convergence rates are slow and sub-optimal. To address this issue, we introduce a novel bilevel optimization method that locally approximates the solution set of the lower-level problem via a cutting plane, and then runs a conditional gradient update to decrease the upper-level objective. When the upper-level objective is convex, we show that our method requires ${\mathcal{O}}(\max\{1/\epsilon_f, 1/\epsilon_g\})$ iterations to find a solution that is $\epsilon_f$-optimal for the upper-level objective and $\epsilon_g$-optimal for the lower-level objective. Moreover, when the upper-level objective is non-convex, our method requires ${\mathcal{O}}(\max\{1/\epsilon_f^2, 1/(\epsilon_f\epsilon_g)\})$ iterations to find an $(\epsilon_f,\epsilon_g)$-optimal solution. We also prove stronger convergence guarantees under the Hölderian error bound assumption on the lower-level problem. To the best of our knowledge, our method achieves the best-known iteration complexity for the considered class of bilevel problems.
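The abstract only sketches the method at a high level: approximate the lower-level solution set with a cutting plane, then take a conditional gradient (Frank-Wolfe) step on the upper-level objective. Below is a minimal illustrative sketch, not the authors' exact algorithm, assuming the lower-level feasible set is a polytope $\{x : Ax \le b\}$; the functions `grad_f`, `grad_g`, the specific halfspace cut, and the step-size rule are assumptions made for illustration and may differ from the paper's construction.

```python
# Minimal sketch (under stated assumptions) of a conditional-gradient-style
# update for  min f(x)  s.t.  x in argmin_{z : Az <= b} g(z).
import numpy as np
from scipy.optimize import linprog

def cg_bilevel_sketch(grad_f, grad_g, A, b, x0, iters=200):
    x = np.asarray(x0, dtype=float)
    n = x.size
    for k in range(iters):
        df = grad_f(x)
        dg = grad_g(x)
        # Cutting plane through the current iterate:
        #   grad_g(x_k)^T (s - x_k) <= 0
        # (one plausible local approximation of the lower-level solution set;
        #  the paper's exact cut may differ).
        A_cut = np.vstack([A, dg])
        b_cut = np.append(b, dg @ x)
        # Linear minimization oracle: minimize the linearized upper-level
        # objective over the feasible polytope intersected with the cut.
        res = linprog(c=df, A_ub=A_cut, b_ub=b_cut,
                      bounds=[(None, None)] * n)
        s = res.x
        gamma = 2.0 / (k + 2)  # standard Frank-Wolfe step size (assumption)
        x = x + gamma * (s - x)
    return x
```

Each iteration costs a single LP solve over the cut feasible set, which is the defining feature of conditional-gradient methods: no projection onto the (implicitly defined) lower-level solution set is ever required.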