Title
On The Convergence of Euler Discretization of Finite-Time Convergent Gradient Flows
Authors
Abstract
In this study, we investigate the performance of two novel first-order optimization algorithms: the rescaled-gradient flow (RGF) and the signed-gradient flow (SGF). These algorithms are derived from the forward Euler discretization of finite-time convergent flows, non-Lipschitz dynamical systems that converge locally to the minima of gradient-dominated functions. We first characterize the closeness between the continuous flows and their discretizations, and then present (linear) convergence guarantees for the discrete algorithms, in both the deterministic and the stochastic case. Furthermore, when problem parameters are unknown or non-uniform, we integrate a line-search strategy with RGF/SGF and provide a convergence analysis in this setting. Finally, we apply the proposed algorithms to academic examples and to deep neural network training; our results show that the proposed schemes converge faster than standard optimization alternatives.
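The abstract describes algorithms obtained by applying forward Euler to non-Lipschitz gradient flows. As a rough illustration only (the paper's exact dynamics and exponents are not given here), the following sketch assumes the commonly used forms: SGF steps along the componentwise sign of the gradient, and RGF steps along the gradient rescaled by a power of its norm; the quadratic objective, the step size `h`, and the exponent parameter `p` are all illustrative choices, not the paper's.

```python
import numpy as np

def grad(x):
    # Gradient of a toy quadratic objective f(x) = 0.5 * ||x||^2.
    return x

def sgf_step(x, h):
    # One forward-Euler step of a signed-gradient flow:
    # x' = -sign(grad f(x)), applied componentwise.
    return x - h * np.sign(grad(x))

def rgf_step(x, h, p=4):
    # One forward-Euler step of a rescaled-gradient flow:
    # x' = -grad f(x) / ||grad f(x)||^{(p-2)/(p-1)}
    # (a hypothetical exponent choice; the flow is non-Lipschitz at grad f = 0).
    g = grad(x)
    n = np.linalg.norm(g)
    if n == 0.0:
        return x  # already at a critical point
    return x - h * g / n ** ((p - 2) / (p - 1))

# Run a few RGF iterations from an arbitrary starting point.
x = np.array([1.0, -2.0])
for _ in range(50):
    x = rgf_step(x, h=0.05)
```

Note that near the minimizer the effective step of both schemes does not vanish with the gradient, which is the discrete counterpart of the finite-time convergence of the continuous flows; in practice this is why a diminishing step size or a line search (as the paper proposes) is needed close to the solution.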