Paper Title
Optimization Landscapes of Wide Deep Neural Networks Are Benign
Paper Authors
Paper Abstract
We analyze the optimization landscapes of deep learning with wide networks. We highlight the importance of constraints for such networks and show that constrained -- as well as unconstrained -- empirical-risk minimization over such networks has no confined points, that is, no suboptimal parameters that are difficult to escape from. Hence, our theory substantiates the common belief that wide neural networks are not only highly expressive but also comparatively easy to optimize.
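To make the two regimes in the abstract concrete, here is a minimal sketch (not the paper's method) contrasting unconstrained and constrained empirical-risk minimization over a wide one-hidden-layer network. The width (5000), the norm bound (3.0), and the projection-onto-a-norm-ball step are illustrative assumptions, not values or definitions from the paper.

    # Minimal sketch: unconstrained vs. constrained ERM on a wide network.
    # All hyperparameters below are illustrative assumptions.
    import torch

    torch.manual_seed(0)
    X = torch.randn(256, 10)   # toy inputs
    y = torch.randn(256, 1)    # toy targets

    def wide_net():
        return torch.nn.Sequential(
            torch.nn.Linear(10, 5000),   # "wide": many hidden units
            torch.nn.ReLU(),
            torch.nn.Linear(5000, 1),
        )

    def erm(net, steps=200, lr=1e-2, norm_bound=None):
        """Minimize the empirical risk with SGD; if norm_bound is set,
        project each parameter tensor back onto a Frobenius-norm ball
        after every step (a simple stand-in for a constraint set)."""
        opt = torch.optim.SGD(net.parameters(), lr=lr)
        loss_fn = torch.nn.MSELoss()
        for _ in range(steps):
            opt.zero_grad()
            loss = loss_fn(net(X), y)
            loss.backward()
            opt.step()
            if norm_bound is not None:
                with torch.no_grad():
                    for p in net.parameters():
                        n = p.norm()
                        if n > norm_bound:
                            p.mul_(norm_bound / n)  # projection step
        return loss.item()

    print("unconstrained ERM loss:", erm(wide_net()))
    print("constrained ERM loss:  ", erm(wide_net(), norm_bound=3.0))

The projection here is one common way to realize constrained ERM in practice; the paper's claim, as stated in the abstract, is that in the wide regime neither variant produces confined points, i.e., suboptimal parameters that such descent procedures would struggle to escape.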