Paper Title

TINC: Tree-structured Implicit Neural Compression

Authors

Runzhao Yang, Tingxiong Xiao, Yuxiao Cheng, Jinli Suo, Qionghai Dai

Abstract

Implicit neural representation (INR) can describe target scenes with high fidelity using a small number of parameters, and is emerging as a promising data compression technique. However, limited spectrum coverage is intrinsic to INR, and effectively removing the redundancy in diverse complex data is non-trivial. Preliminary studies can only exploit either the global or the local correlation in the target data, and thus achieve limited performance. In this paper, we propose Tree-structured Implicit Neural Compression (TINC), which builds compact representations of local regions and extracts the shared features of these local representations in a hierarchical manner. Specifically, we use Multi-Layer Perceptrons (MLPs) to fit the partitioned local regions, and these MLPs are organized in a tree structure to share parameters according to spatial distance. The parameter sharing scheme not only ensures continuity between adjacent regions, but also jointly removes local and non-local redundancy. Extensive experiments show that TINC improves the compression fidelity of INR and achieves impressive compression capability compared with commercial tools and other deep-learning-based methods. Moreover, the approach is highly flexible and can be tailored to different data and parameter settings. The source code can be found at https://github.com/RichealYoung/TINC .
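To make the hierarchical parameter sharing concrete, below is a minimal PyTorch sketch of one way such a tree-structured INR could be organized. This is an illustrative assumption, not the authors' implementation (see the repository linked above): each tree level owns one hidden layer per node, and the MLP for a leaf region is the chain of layers along its root-to-leaf path, so spatially close regions share their shallow layers while distant regions share only the layers near the root. The class name `TreeSharedINR` and all hyperparameter values here are hypothetical.

```python
import torch
import torch.nn as nn


class TreeSharedINR(nn.Module):
    """Sketch of tree-structured parameter sharing for an INR.

    Level l of the tree (1..depth) has branching**l nodes, each owning one
    hidden layer. A leaf region's MLP is the stem plus the hidden layers of
    its ancestors, so leaves under a common ancestor share those layers.
    """

    def __init__(self, depth=2, branching=8, width=64, in_dim=3, out_dim=1):
        super().__init__()
        self.depth = depth
        self.branching = branching
        self.stem = nn.Linear(in_dim, width)  # shared by every region (tree root)
        self.levels = nn.ModuleList([
            nn.ModuleList([nn.Linear(width, width) for _ in range(branching ** l)])
            for l in range(1, depth + 1)
        ])
        # One small output head per leaf region.
        self.heads = nn.ModuleList([
            nn.Linear(width, out_dim) for _ in range(branching ** depth)
        ])

    def forward(self, coords, leaf_idx):
        # coords: (N, in_dim) coordinates inside leaf region `leaf_idx`.
        h = torch.sin(self.stem(coords))  # sine activation, common in INRs
        for l in range(1, self.depth + 1):
            # Index of this leaf's ancestor at level l; that layer is shared
            # by all leaves in the ancestor's subtree.
            idx = leaf_idx // (self.branching ** (self.depth - l))
            h = torch.sin(self.levels[l - 1][idx](h))
        return self.heads[leaf_idx](h)


# Example usage: query field values at coordinates inside leaf region 5.
model = TreeSharedINR(depth=2, branching=8, width=64)
coords = torch.rand(1024, 3) * 2 - 1   # coordinates normalized to [-1, 1]^3
values = model(coords, leaf_idx=5)     # -> tensor of shape (1024, 1)
```

Under this scheme, two adjacent regions that share a parent differ only in their deepest layer and output head, which is one way to realize the continuity and joint local/non-local redundancy removal described in the abstract.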
