Paper Title

Truly shift-invariant convolutional neural networks

Paper Authors

Anadi Chaman, Ivan Dokmanić

Paper Abstract

Thanks to the use of convolution and pooling layers, convolutional neural networks were for a long time thought to be shift-invariant. However, recent works have shown that the output of a CNN can change significantly with small shifts in input: a problem caused by the presence of downsampling (stride) layers. The existing solutions rely either on data augmentation or on anti-aliasing, both of which have limitations and neither of which enables perfect shift invariance. Additionally, the gains obtained from these methods do not extend to image patterns not seen during training. To address these challenges, we propose adaptive polyphase sampling (APS), a simple sub-sampling scheme that allows convolutional neural networks to achieve 100% consistency in classification performance under shifts, without any loss in accuracy. With APS, the networks exhibit perfect consistency to shifts even before training, making it the first approach that makes convolutional neural networks truly shift-invariant.
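The abstract does not spell out how APS chooses its sub-sampling grid, but the core idea can be illustrated: a strided layer keeps one of several polyphase components (the interleaved grids a stride-`s` subsampler could pick), and an input shift merely permutes those components. Selecting the component by a shift-agnostic rule, such as the largest norm, therefore yields the same sampled values regardless of the shift. The following is a minimal NumPy sketch of that selection rule; the function name, the ℓ1-norm criterion, and the 2-D single-channel setting are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def adaptive_polyphase_sampling(x, stride=2):
    """Sketch of APS-style downsampling on a 2-D map: keep the
    polyphase component with the largest l1 norm. Shifting the
    input only permutes the components, so the same values are
    selected either way (up to an internal shift)."""
    h, w = x.shape
    # Crop so both dimensions divide evenly by the stride.
    x = x[:h - h % stride, :w - w % stride]
    # Enumerate the stride*stride polyphase components.
    components = [x[i::stride, j::stride]
                  for i in range(stride) for j in range(stride)]
    # Shift-agnostic selection rule (l1 norm is an assumption here).
    norms = [np.abs(c).sum() for c in components]
    return components[int(np.argmax(norms))]

# A circular shift permutes the polyphase components, so the
# selected values are identical up to a roll:
x = np.arange(16, dtype=float).reshape(4, 4)
same_values = np.array_equal(
    np.sort(adaptive_polyphase_sampling(x), axis=None),
    np.sort(adaptive_polyphase_sampling(np.roll(x, 1, axis=0)), axis=None),
)
```

In contrast, a fixed-grid stride (always taking `x[::2, ::2]`) returns different values once the input shifts by one pixel, which is the inconsistency the abstract attributes to downsampling layers.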
