Paper Title

An Overview of Uncertainty Quantification Methods for Infinite Neural Networks

Paper Authors

Florian Juengermann, Maxime Laasri, Marius Merkle

Paper Abstract

To better understand the theoretical behavior of large neural networks, several works have analyzed the case where a network's width tends to infinity. In this regime, the effect of random initialization and the process of training a neural network can be formally expressed with analytical tools like Gaussian processes and neural tangent kernels. In this paper, we review methods for quantifying uncertainty in such infinite-width neural networks and compare their relationship to Gaussian processes in the Bayesian inference framework. We make use of several equivalence results along the way to obtain exact closed-form solutions for predictive uncertainty.
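The closed-form predictive uncertainty mentioned in the abstract comes from exact Gaussian-process inference with a kernel induced by the infinite-width limit. As a minimal illustrative sketch (not the authors' code), the snippet below uses the order-1 arc-cosine kernel, which is the NNGP kernel of a one-hidden-layer ReLU network in the infinite-width limit, and computes the exact GP posterior mean and variance; the kernel choice, data, and noise level here are illustrative assumptions.

```python
import numpy as np

def arccos_kernel(X1, X2):
    # Order-1 arc-cosine kernel (Cho & Saul): the NNGP kernel of a
    # one-hidden-layer ReLU network as its width tends to infinity.
    # Inputs are assumed nonzero so the norms do not vanish.
    n1 = np.linalg.norm(X1, axis=1)
    n2 = np.linalg.norm(X2, axis=1)
    cos = np.clip((X1 @ X2.T) / np.outer(n1, n2), -1.0, 1.0)
    theta = np.arccos(cos)
    return (np.outer(n1, n2) / np.pi) * (
        np.sin(theta) + (np.pi - theta) * np.cos(theta)
    )

def gp_predict(X_train, y_train, X_test, noise=1e-2):
    # Exact (closed-form) GP posterior: mean and pointwise variance.
    K = arccos_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = arccos_kernel(X_test, X_train)
    Kss = arccos_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)                    # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha                            # K_* K^-1 y
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(Kss) - np.sum(v**2, axis=0)    # K_** - K_* K^-1 K_*^T
    return mean, var
```

Near a training input the posterior variance shrinks toward the observation-noise level, while it stays large for inputs far from the data, which is exactly the kind of predictive uncertainty the paper compares across methods.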
