Paper Title

Tensor Robust PCA with Nonconvex and Nonlocal Regularization

Paper Authors

Xiaoyu Geng, Qiang Guo, Shuaixiong Hui, Ming Yang, Caiming Zhang

Paper Abstract

Tensor robust principal component analysis (TRPCA) is a classical approach to low-rank tensor recovery, which minimizes the convex surrogate of the tensor rank by shrinking each tensor singular value equally. However, for real-world visual data, large singular values carry more significant information than small singular values. In this paper, we propose a nonconvex TRPCA (N-TRPCA) model based on the tensor adjustable logarithmic norm. Unlike TRPCA, our N-TRPCA can adaptively shrink small singular values more and large singular values less. In addition, TRPCA assumes that the whole data tensor is of low rank. This assumption is hardly satisfied in practice for natural visual data, restricting the capability of TRPCA to recover edges and texture details from noisy images and videos. To this end, we integrate nonlocal self-similarity into N-TRPCA and further develop a nonconvex and nonlocal TRPCA (NN-TRPCA) model. Specifically, similar nonlocal patches are grouped into a tensor, and each group tensor is then recovered by our N-TRPCA. Since the patches in one group are highly correlated, all group tensors exhibit a strong low-rank property, leading to an improvement in recovery performance. Experimental results demonstrate that the proposed NN-TRPCA outperforms existing TRPCA methods in visual data recovery. The demo code is available at https://github.com/qguo2010/NN-TRPCA.
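To make the contrast between equal and adaptive singular-value shrinkage concrete, the sketch below compares the uniform soft-thresholding implied by a convex nuclear-norm surrogate with a weighted shrinkage in the spirit of a logarithmic surrogate, applied to the singular values of a single matrix. This is a minimal NumPy illustration, not the authors' NN-TRPCA implementation: the weight rule 1/(sigma + eps), the function names, and the use of a plain matrix SVD instead of the t-SVD are assumptions made for clarity; the exact tensor adjustable logarithmic norm and its proximal operator are given in the paper and the demo code linked above.

```python
# Illustrative sketch only (not the authors' code): contrasts equal shrinkage
# of singular values (prox of the nuclear norm) with a log-style weighted
# shrinkage that penalizes large singular values less and small ones more.
import numpy as np

def soft_threshold(sigma, tau):
    """Shrink every singular value by the same amount tau."""
    return np.maximum(sigma - tau, 0.0)

def log_weighted_threshold(sigma, tau, eps=1e-3):
    """Shrink small singular values more and large ones less, using weights
    derived from a logarithmic surrogate (an assumed, illustrative choice)."""
    weights = 1.0 / (sigma + eps)  # large sigma -> small weight -> less shrinkage
    return np.maximum(sigma - tau * weights, 0.0)

def shrink_matrix(X, tau, shrink):
    """Low-rank estimate of X obtained by shrinking its singular values."""
    U, sigma, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(shrink(sigma, tau)) @ Vt

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    L = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 50))  # rank-5 ground truth
    X = L + 0.1 * rng.standard_normal((50, 50))                      # noisy observation
    for shrink in (soft_threshold, log_weighted_threshold):
        rec = shrink_matrix(X, tau=1.0, shrink=shrink)
        print(shrink.__name__, np.linalg.norm(rec - L) / np.linalg.norm(L))
```

In the NN-TRPCA setting, the same idea is applied per group tensor built from similar nonlocal patches, so that each group is strongly low-rank before shrinkage is performed.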
