Paper Title

The Ridgelet Prior: A Covariance Function Approach to Prior Specification for Bayesian Neural Networks

Paper Authors

Takuo Matsubara, Chris J. Oates, François-Xavier Briol

Abstract

Bayesian neural networks attempt to combine the strong predictive performance of neural networks with formal quantification of uncertainty associated with the predictive output in the Bayesian framework. However, it remains unclear how to endow the parameters of the network with a prior distribution that is meaningful when lifted into the output space of the network. A possible solution is proposed that enables the user to posit an appropriate Gaussian process covariance function for the task at hand. Our approach constructs a prior distribution for the parameters of the network, called a ridgelet prior, that approximates the posited Gaussian process in the output space of the network. In contrast to existing work on the connection between neural networks and Gaussian processes, our analysis is non-asymptotic, with finite sample-size error bounds provided. This establishes the universality property that a Bayesian neural network can approximate any Gaussian process whose covariance function is sufficiently regular. Our experimental assessment is limited to a proof-of-concept, where we demonstrate that the ridgelet prior can out-perform an unstructured prior on regression problems for which a suitable Gaussian process prior can be provided.
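
For a concrete picture of the problem the abstract describes, the sketch below is an illustration only, not the paper's ridgelet-prior construction: the tanh architecture, network width, GP lengthscale, and weight scales are arbitrary assumptions. It draws prior-predictive samples from a one-hidden-layer network with an unstructured Gaussian prior and measures how far their empirical output-space covariance falls from a posited squared-exponential Gaussian process covariance; a structured prior such as the ridgelet prior is designed to make this kind of gap small.

```python
# Illustrative sketch (not the paper's construction): contrast the
# prior-predictive covariance of a one-hidden-layer tanh network under an
# unstructured Gaussian prior with a posited squared-exponential GP
# covariance, the kind of target the ridgelet prior aims to match in the
# network's output space.
import numpy as np

rng = np.random.default_rng(0)

def gp_cov(x, lengthscale=0.5, variance=1.0):
    """Posited squared-exponential GP covariance matrix on inputs x."""
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def bnn_prior_predictive_cov(x, width=200, n_samples=2000, weight_std=1.0):
    """Empirical output-space covariance of a tanh network whose weights
    and biases all carry an unstructured Gaussian prior."""
    outputs = np.empty((n_samples, x.size))
    for s in range(n_samples):
        w1 = rng.normal(0.0, weight_std, size=(1, width))
        b1 = rng.normal(0.0, weight_std, size=width)
        w2 = rng.normal(0.0, weight_std / np.sqrt(width), size=(width, 1))
        b2 = rng.normal(0.0, weight_std)
        h = np.tanh(x[:, None] @ w1 + b1)   # hidden-layer activations
        outputs[s] = (h @ w2).ravel() + b2  # network outputs on the grid x
    return np.cov(outputs, rowvar=False)

x = np.linspace(-2.0, 2.0, 20)
target = gp_cov(x)
empirical = bnn_prior_predictive_cov(x)
# The ridgelet prior is designed to shrink this output-space discrepancy.
print("Frobenius gap to the posited GP covariance:",
      np.linalg.norm(empirical - target))
```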
