Paper Title

Learning to solve TV regularized problems with unrolled algorithms

Paper Authors

Hamza Cherkaoui, Jeremias Sulam, Thomas Moreau

Paper Abstract

Total Variation (TV) is a popular regularization strategy that promotes piece-wise constant signals by constraining the $\ell_1$-norm of the first-order derivative of the estimated signal. The resulting optimization problem is usually solved using iterative algorithms such as proximal gradient descent, primal-dual algorithms, or ADMM. However, such methods can require a very large number of iterations to converge to a suitable solution. In this paper, we accelerate such iterative algorithms by unfolding proximal gradient descent solvers in order to learn their parameters for 1D TV regularized problems. While this could be done using the synthesis formulation, we demonstrate that it leads to slower performance. The main difficulty in applying such methods in the analysis formulation lies in proposing a way to compute the derivatives through the proximal operator. As our main contribution, we develop and characterize two approaches to do so, describe their benefits and limitations, and discuss the regime where they can actually improve over iterative procedures. We validate these findings with experiments on synthetic and real data.
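For concreteness, the two formulations mentioned in the abstract can be written out as follows. The notation (linear forward operator $A$, first-order difference operator $D$, and discrete integration operator $L$ with $u = Lz$, so that $DL$ recovers $z_2, \dots, z_k$) is a common convention for this problem and is not fixed by the abstract itself:

$$
\text{(analysis)}\quad \min_{u \in \mathbb{R}^k} \frac{1}{2}\|x - Au\|_2^2 + \lambda \|Du\|_1,
\qquad
\text{(synthesis)}\quad \min_{z \in \mathbb{R}^k} \frac{1}{2}\|x - ALz\|_2^2 + \lambda \sum_{i=2}^{k} |z_i|.
$$

Below is a minimal NumPy sketch of the synthesis-formulation route: plain ISTA whose iterations can be read as the layers of an unrolled network. This is an illustration under the assumptions above, not the authors' implementation; names such as `unrolled_ista_synthesis` are made up for this example, and in a learned ("LISTA-like") variant the per-layer step sizes and weight matrices would be trained rather than fixed.

```python
import numpy as np


def soft_thresholding(z, mu):
    """Soft-thresholding: the proximal operator of mu * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - mu, 0.0)


def unrolled_ista_synthesis(x, A, lmbd, n_layers=20):
    """Run n_layers ISTA iterations on the synthesis formulation of 1D TV.

    Each iteration plays the role of one layer of an unrolled network;
    learning would replace the fixed step size and matrices below with
    trainable per-layer parameters.
    """
    k = A.shape[1]
    L = np.tril(np.ones((k, k)))                 # integration operator: u = L z
    AL = A @ L
    step = 1.0 / np.linalg.norm(AL, ord=2) ** 2  # 1 / Lipschitz constant of the gradient
    z = np.zeros(k)
    for _ in range(n_layers):                    # one unrolled "layer" per iteration
        z = z - step * AL.T @ (AL @ z - x)       # gradient step on the data-fit term
        z[1:] = soft_thresholding(z[1:], step * lmbd)  # z[0] (the offset) is unpenalized
    return L @ z                                 # piece-wise constant estimate u


# Usage example: denoising (A = identity) a noisy piece-wise constant signal.
rng = np.random.default_rng(0)
u_true = np.repeat([0.0, 2.0, -1.0, 1.0], 25)
x = u_true + 0.3 * rng.standard_normal(u_true.size)
u_hat = unrolled_ista_synthesis(x, np.eye(u_true.size), lmbd=1.0, n_layers=200)
```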
