Paper Title

Noise Injection Node Regularization for Robust Learning

Authors

Noam Levi, Itay M. Bloch, Marat Freytsis, Tomer Volansky

Abstract

We introduce Noise Injection Node Regularization (NINR), a method of injecting structured noise into Deep Neural Networks (DNN) during the training stage, resulting in an emergent regularizing effect. We present theoretical and empirical evidence for substantial improvement in robustness against various test data perturbations for feed-forward DNNs when trained under NINR. The novelty in our approach comes from the interplay of adaptive noise injection and initialization conditions such that noise is the dominant driver of dynamics at the start of training. As it simply requires the addition of external nodes without altering the existing network structure or optimization algorithms, this method can be easily incorporated into many standard problem specifications. We find improved stability against a number of data perturbations, including domain shifts, with the most dramatic improvement obtained for unstructured noise, where our technique outperforms other existing methods such as Dropout or $L_2$ regularization, in some cases. We further show that desirable generalization properties on clean data are generally maintained.
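The core mechanism described above is the addition of external noise-carrying nodes to the network input during training, without altering the existing architecture or optimizer. The following is a minimal illustrative sketch of that idea, not the authors' reference implementation; the function name, the Gaussian noise choice, and the convention of zeroing the noise nodes at test time are assumptions made here for clarity.

```python
import numpy as np

def augment_with_noise_nodes(x, num_noise_nodes, noise_std, training=True, rng=None):
    """Append `num_noise_nodes` extra input features to each sample.

    During training the extra nodes carry Gaussian noise, so the noise
    can drive the early training dynamics; at test time they are set to
    zero and the trained network processes clean inputs unchanged.
    Shapes and defaults here are illustrative assumptions.
    """
    rng = rng or np.random.default_rng(0)
    batch = x.shape[0]
    if training:
        extra = rng.normal(0.0, noise_std, size=(batch, num_noise_nodes))
    else:
        extra = np.zeros((batch, num_noise_nodes))
    return np.concatenate([x, extra], axis=1)

# Example: a batch of 4 samples with 3 features gains 2 noise nodes,
# so any downstream feed-forward network simply takes 5 inputs.
x = np.ones((4, 3))
x_train = augment_with_noise_nodes(x, num_noise_nodes=2, noise_std=0.1)
x_test = augment_with_noise_nodes(x, num_noise_nodes=2, noise_std=0.1,
                                  training=False)
```

Because the change is confined to the input representation, the same trick drops into most standard training pipelines: the first layer's weight matrix gains a few extra columns, and everything else is untouched.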
