Paper Title

Approximation Bounds for Random Neural Networks and Reservoir Systems

Authors

Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega

Abstract

This work studies approximation based on single-hidden-layer feedforward and recurrent neural networks with randomly generated internal weights. These methods, in which only the last layer of weights and a few hyperparameters are optimized, have been successfully applied in a wide range of static and dynamic learning problems. Despite the popularity of this approach in empirical tasks, important theoretical questions regarding the relation between the unknown function, the weight distribution, and the approximation rate have remained open. In this work it is proved that, as long as the unknown function, functional, or dynamical system is sufficiently regular, it is possible to draw the internal weights of the random (recurrent) neural network from a generic distribution (not depending on the unknown object) and quantify the error in terms of the number of neurons and the hyperparameters. In particular, this proves that echo state networks with randomly generated weights are capable of approximating a wide class of dynamical systems arbitrarily well and thus provides the first mathematical explanation for their empirically observed success at learning dynamical systems.
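To make the setting concrete, below is a minimal NumPy sketch of an echo state network in this spirit; it is an illustration of the general recipe, not the paper's construction. The reservoir matrix A, input weights C, and biases b are drawn at random from a generic distribution and left untrained, and only the linear readout W is fitted by least squares. The target filter, reservoir size n_res, spectral radius rho, input scaling gamma, and washout length are all hypothetical choices for the example, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)


def reservoir_states(u, n_res=200, rho=0.9, gamma=1.0):
    """Run a randomly generated reservoir x_t = tanh(A x_{t-1} + C u_t + b)
    over a scalar input sequence and return the state trajectory."""
    A = rng.standard_normal((n_res, n_res))
    A *= rho / np.max(np.abs(np.linalg.eigvals(A)))  # rescale to spectral radius rho
    C = gamma * rng.standard_normal(n_res)           # random input weights (untrained)
    b = 0.1 * rng.standard_normal(n_res)             # random biases (untrained)
    X = np.zeros((len(u), n_res))
    x = np.zeros(n_res)
    for t, u_t in enumerate(u):
        x = np.tanh(A @ x + C * u_t + b)
        X[t] = x
    return X


# Illustrative target: a simple fading-memory nonlinear filter, standing in
# for the "sufficiently regular" dynamical system of the approximation result.
T = 2000
u = rng.uniform(-1.0, 1.0, T)
y = np.sin(u) + 0.5 * np.concatenate(([0.0], u[:-1])) ** 2

X = reservoir_states(u)
washout = 100  # discard the initial transient before fitting
W, *_ = np.linalg.lstsq(X[washout:], y[washout:], rcond=None)  # only the readout is trained
print("train MSE:", np.mean((X[washout:] @ W - y[washout:]) ** 2))
```

The static (feedforward) case studied in the paper follows the same recipe without the recurrent state: random hidden weights and biases generate features, and only the linear output layer is optimized.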
