Paper Title

A Stochastic Variance Reduced Primal Dual Fixed Point Method For Linearly Constrained Separable Optimization

Authors

Ya-Nan Zhu, Xiaoqun Zhang

Abstract

In this paper we combine the stochastic variance reduced gradient (SVRG) method [17] with the primal dual fixed point method (PDFP) proposed in [7] to minimize the sum of two convex functions, one of which is composed with a linear transform. This type of problem arises typically in sparse signal and image reconstruction. The proposed SVRG-PDFP can be seen as a generalization of Prox-SVRG [37], which was originally designed for minimizing the sum of two convex functions. Under some standard assumptions, we propose two variants, one for strongly convex objective functions and the other for the general convex case. Convergence analysis shows that the convergence rate of SVRG-PDFP is O(1/k) (here k is the iteration number) for a general convex objective function, and linear for the strongly convex case. Numerical examples on machine learning and CT image reconstruction are provided to show the effectiveness of the algorithms.
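The problem class described above is min_x f(x) + g(Bx), with f(x) = (1/n) Σ_i f_i(x) smooth and B a linear operator. Below is a minimal sketch, not the paper's exact algorithm, of how the two ingredients can be combined for one concrete instance, min_x (1/2n)||Ax - b||² + mu·||Bx||_1: the PDFP iteration from [7] is run with the full gradient replaced by the SVRG estimator from [17]. The function name, the step-size parameters gamma and lam, and the inner-loop length m are illustrative assumptions; see the paper for the exact step-size conditions.

```python
import numpy as np

def prox_l1(v, t):
    """Soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def svrg_pdfp(A, b, B, mu, gamma, lam, n_epochs=30, m=None, seed=0):
    """Sketch of an SVRG-PDFP-style iteration for
        min_x (1/2n) ||A x - b||^2 + mu * ||B x||_1.
    gamma, lam: primal/dual step sizes, assumed to satisfy the usual PDFP
    ranges (roughly 0 < gamma < 2/L and 0 < lam <= 1/||B||^2); the paper
    gives the precise conditions.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    p = B.shape[0]
    m = m or n                      # inner-loop length per epoch
    x_tilde = np.zeros(d)           # SVRG snapshot point
    v = np.zeros(p)                 # PDFP dual variable
    for _ in range(n_epochs):
        # full gradient of (1/2n)||Ax - b||^2 at the snapshot
        full_grad = A.T @ (A @ x_tilde - b) / n
        x = x_tilde.copy()
        for _ in range(m):
            i = rng.integers(n)
            # SVRG variance-reduced gradient estimate
            g = (A[i] * (A[i] @ x - b[i])
                 - A[i] * (A[i] @ x_tilde - b[i])
                 + full_grad)
            # PDFP primal-dual steps with g in place of the full gradient
            y = x - gamma * g - lam * (B.T @ v)
            w = B @ y + v
            v = w - prox_l1(w, (gamma / lam) * mu)   # (I - prox) dual update
            x = x - gamma * g - lam * (B.T @ v)
        x_tilde = x                 # refresh the snapshot
    return x_tilde
```

As a sanity check on this sketch, with B equal to the identity and lam = 1.0 the three PDFP steps collapse to x <- prox_{gamma*mu*||.||_1}(x - gamma*g), i.e. a Prox-SVRG iteration for the Lasso, consistent with the abstract's remark that SVRG-PDFP generalizes Prox-SVRG [37].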
