Paper Title


Smoother Network Tuning and Interpolation for Continuous-level Image Processing

Authors

Hyeongmin Lee, Taeoh Kim, Hanbin Son, Sangwook Baek, Minsu Cheon, Sangyoun Lee

Abstract


In Convolutional Neural Network (CNN) based image processing, most studies propose networks that are optimized for a single level (or single objective); thus, they underperform on other levels and must be retrained to deliver optimal performance. Using multiple models to cover multiple levels involves very high computational costs. To solve these problems, recent approaches train networks on two different levels and propose their own interpolation methods to enable arbitrary intermediate levels. However, many of them fail to generalize or produce certain side effects in practical usage. In this paper, we define these frameworks as network tuning and interpolation and propose a novel module for continuous-level learning, called the Filter Transition Network (FTN). This module is structurally smoother than existing ones; therefore, frameworks with FTN generalize well across various tasks and networks and cause fewer undesirable side effects. For stable learning of FTN, we additionally propose a method to initialize non-linear neural network layers with identity mappings. Extensive results on various image processing tasks indicate that FTN delivers comparable performance across multiple continuous levels while being significantly smoother and lighter than other frameworks.
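The abstract does not spell out the FTN architecture, but the general idea of continuous-level tuning and interpolation can be sketched as follows: a small transition network maps the filters of a base convolution (trained for one level) toward filters suited to a second level, and an interpolation weight alpha blends the two filter sets at inference time. The module name, layer sizes, and the 1x1 identity initialization below are illustrative assumptions, not the authors' exact FTN design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FilterTransitionSketch(nn.Module):
    """Minimal conceptual sketch of continuous-level filter interpolation.

    Assumptions (not the paper's exact configuration): the transition network
    is a single 1x1 convolution applied to the base filters, and it is
    initialized as an identity mapping so training starts from the
    level-0 behavior.
    """

    def __init__(self, in_ch: int, out_ch: int, k: int = 3):
        super().__init__()
        # Base convolution, assumed to be trained for the first level (level 0).
        self.base = nn.Conv2d(in_ch, out_ch, k, padding=k // 2, bias=False)
        # Transition network acting on the base filters themselves; the filter
        # tensor (out_ch, in_ch, k, k) is treated as a batch of small images.
        self.transition = nn.Conv2d(in_ch, in_ch, kernel_size=1, bias=False)
        # Identity initialization: transformed filters start equal to the base
        # filters, which keeps early training stable.
        with torch.no_grad():
            self.transition.weight.copy_(torch.eye(in_ch).view(in_ch, in_ch, 1, 1))

    def forward(self, x: torch.Tensor, alpha: float) -> torch.Tensor:
        w0 = self.base.weight                  # filters tuned for level 0
        w1 = self.transition(w0)               # filters pushed toward level 1
        w = (1.0 - alpha) * w0 + alpha * w1    # smooth blend for an intermediate level
        return F.conv2d(x, w, padding=self.base.padding)


# Usage: alpha=0.0 reproduces the level-0 network, alpha=1.0 the level-1 one,
# and intermediate values approximate levels in between.
x = torch.randn(1, 3, 64, 64)
layer = FilterTransitionSketch(in_ch=3, out_ch=16)
y = layer(x, alpha=0.5)
```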
