Paper Title

Learning Invariant Weights in Neural Networks

Authors

Tycho F. A. van der Ouderaa, Mark van der Wilk

Abstract

Assumptions about invariances or symmetries in data can significantly increase the predictive power of statistical models. Many commonly used models in machine learning are constrained to respect certain symmetries in the data, such as translation equivariance in convolutional neural networks, and the incorporation of new symmetry types is an active area of research. Yet, learning such invariances from the data itself remains an open research problem. It has been shown that the marginal likelihood offers a principled way to learn invariances in Gaussian processes. We propose a weight-space equivalent to this approach: by maximizing a lower bound on the marginal likelihood, invariances can be learned in neural networks, naturally resulting in higher-performing models.
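
For context, the "lower bound on the marginal likelihood" referred to here is the standard variational lower bound (ELBO). A sketch of its usual form, writing q(w) for a variational posterior over network weights w (notation assumed for illustration, not taken from the paper):

$$\log p(\mathbf{y} \mid X) \;\ge\; \mathbb{E}_{q(\mathbf{w})}\!\left[\log p(\mathbf{y} \mid X, \mathbf{w})\right] \;-\; \mathrm{KL}\!\left(q(\mathbf{w}) \,\|\, p(\mathbf{w})\right)$$

Maximizing the right-hand side over both the variational parameters and any hyperparameters that govern an invariance (for example, the range of transformations the model should be insensitive to) is what allows invariances to be learned by gradient descent rather than fixed a priori.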

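A minimal sketch of how such a learnable invariance can be made differentiable, assuming a uniform distribution over input rotations whose half-width is a trainable parameter. The class name `InvariantClassifier`, the choice of rotations, and the fixed sample count are illustrative assumptions, not the authors' implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InvariantClassifier(nn.Module):
    """Average predictions over sampled input rotations whose angular
    range is itself learned. Illustrative sketch only."""

    def __init__(self, backbone: nn.Module, n_samples: int = 8):
        super().__init__()
        self.backbone = backbone
        self.n_samples = n_samples
        # Learnable log half-width of a uniform distribution over angles.
        self.log_width = nn.Parameter(torch.tensor(-2.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        width = self.log_width.exp()
        logits = 0.0
        for _ in range(self.n_samples):
            # Reparameterized angle sample: gradients flow into log_width,
            # so the amount of rotation invariance is trained end to end.
            theta = (torch.rand(x.size(0), device=x.device) * 2 - 1) * width
            cos, sin = theta.cos(), theta.sin()
            # 2x3 affine matrices encoding a pure rotation of each image.
            mat = torch.zeros(x.size(0), 2, 3, device=x.device)
            mat[:, 0, 0], mat[:, 0, 1] = cos, -sin
            mat[:, 1, 0], mat[:, 1, 1] = sin, cos
            grid = F.affine_grid(mat, list(x.shape), align_corners=False)
            logits = logits + self.backbone(
                F.grid_sample(x, grid, align_corners=False))
        # Monte Carlo average over transformations approximates the
        # invariance integral in the predictive distribution.
        return logits / self.n_samples
```

Under a marginal-likelihood-style objective such as the bound above, the learned width can grow when rotation invariance improves the data fit and shrink when it does not; a plain maximum-likelihood loss would generally not provide this pressure, which is why the bound matters.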