Title

Boosting Frank-Wolfe by Chasing Gradients

Authors

Cyrille W. Combettes, Sebastian Pokutta

Abstract

The Frank-Wolfe algorithm has become a popular first-order optimization algorithm for it is simple and projection-free, and it has been successfully applied to a variety of real-world problems. Its main drawback however lies in its convergence rate, which can be excessively slow due to naive descent directions. We propose to speed up the Frank-Wolfe algorithm by better aligning the descent direction with that of the negative gradient via a subroutine. This subroutine chases the negative gradient direction in a matching pursuit style while still preserving the projection-free property. Although the approach is reasonably natural, it produces very significant results. We derive convergence rates $\mathcal{O}(1/t)$ to $\mathcal{O}(e^{-\omega t})$ for our method, and we demonstrate its competitive advantage both per iteration and in CPU time over the state of the art in a series of computational experiments.
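The idea described in the abstract, building the descent direction from linear minimization calls so that it aligns better with the negative gradient, can be illustrated with a short sketch. The code below is a minimal illustration of that idea only, not the authors' exact algorithm: the objective (a least-squares loss), the feasible region (the probability simplex), the number of pursuit rounds K, the tolerance delta, and the open-loop step size are all assumptions made for the example.

```python
# A minimal sketch (not the paper's exact BoostFW implementation) of a Frank-Wolfe
# loop whose descent direction is built by a matching pursuit-style subroutine that
# chases the negative gradient, while every oracle call remains a linear minimization.
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the probability simplex: vertex minimizing <grad, v>."""
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def chase_gradient(x, grad, K=5, delta=1e-3):
    """Build a direction d = sum_i lam_i (v_i - x), lam_i >= 0, aligned with -grad (sketch)."""
    r = -grad             # residual to be chased
    d = np.zeros_like(x)  # accumulated direction
    lam_sum = 0.0         # sum of coefficients, used later to keep the step feasible
    for _ in range(K):
        v = lmo_simplex(-r)   # vertex maximizing <r, v>
        u = v - x             # candidate pursuit atom
        denom = u @ u
        if denom == 0.0:
            break
        lam = max((r @ u) / denom, 0.0)
        # Stop when the residual no longer shrinks by a relative delta (assumed rule).
        if lam == 0.0 or np.linalg.norm(r - lam * u) > (1 - delta) * np.linalg.norm(r):
            break
        d += lam * u
        r = r - lam * u
        lam_sum += lam
    return d, lam_sum

def boosted_fw(grad_f, x0, T=200):
    """Frank-Wolfe loop using the chased direction; falls back to the plain FW step."""
    x = x0.copy()
    for t in range(T):
        g = grad_f(x)
        d, lam_sum = chase_gradient(x, g)
        if lam_sum == 0.0:                 # fallback: classical Frank-Wolfe direction
            d, lam_sum = lmo_simplex(g) - x, 1.0
        gamma = 2.0 / (t + 2)              # simple open-loop step size (assumption)
        x = x + (gamma / lam_sum) * d      # convex combination, so x stays in the simplex
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 20))
    b = rng.standard_normal(30)
    grad_f = lambda x: A.T @ (A @ x - b)   # gradient of 0.5 * ||Ax - b||^2
    x = boosted_fw(grad_f, np.ones(20) / 20)
    print("objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```

Since d is a nonnegative combination of vertex directions, dividing the step by the coefficient sum keeps the iterate a convex combination of simplex vertices, so no projection is ever needed; step-size rules other than the open-loop one shown here (e.g., line search) would also fit this template.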
