Paper Title

Deep Learning is Singular, and That's Good

Authors

Daniel Murfet, Susan Wei, Mingming Gong, Hui Li, Jesse Gell-Redman, Thomas Quella

Abstract

In singular models, the optimal set of parameters forms an analytic set with singularities and classical statistical inference cannot be applied to such models. This is significant for deep learning as neural networks are singular and thus "dividing" by the determinant of the Hessian or employing the Laplace approximation are not appropriate. Despite its potential for addressing fundamental issues in deep learning, singular learning theory appears to have made little inroads into the developing canon of deep learning theory. Via a mix of theory and experiment, we present an invitation to singular learning theory as a vehicle for understanding deep learning and suggest important future work to make singular learning theory directly applicable to how deep learning is performed in practice.
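To make the abstract's point concrete, here is a minimal toy illustration (our own sketch, not an example from the paper): a two-parameter model whose loss depends only on the product of its parameters. The optimal parameters form an analytic set (a hyperbola), and the Hessian of the loss is degenerate everywhere on it, so its determinant vanishes and the Laplace approximation is ill-defined.

```python
import numpy as np

# Toy singular model: fit f(x) = a*b*x to the target f(x) = x.
# The loss depends on (a, b) only through the product a*b, so the set of
# optimal parameters is the hyperbola {a*b = 1} -- an analytic set with a
# flat direction along which the Hessian is degenerate.

def loss(a, b):
    return (a * b - 1.0) ** 2

def hessian(a, b):
    # Analytic Hessian of the loss above.
    return np.array([
        [2 * b**2,            2 * (2 * a * b - 1)],
        [2 * (2 * a * b - 1), 2 * a**2],
    ])

# Pick any point on the optimal set {a*b = 1}.
a, b = 2.0, 0.5
H = hessian(a, b)
print("det(H) at a minimum:", np.linalg.det(H))  # ~0: Laplace approx. breaks
print("eigenvalues:", np.linalg.eigvalsh(H))     # one eigenvalue is exactly 0
```

The zero eigenvalue points along the level set of optima: "dividing" by det(H), as classical model-selection criteria implicitly do, is not meaningful here, which is the situation singular learning theory is built to handle.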
