Paper title
Evolutionary Echo State Network: evolving reservoirs in the Fourier space
Paper authors
Paper abstract
The Echo State Network (ESN) is a class of Recurrent Neural Network with a large number of hidden-hidden weights (in the so-called reservoir). Canonical ESNs and their variations have recently received significant attention due to their remarkable success in modeling non-linear dynamical systems. The reservoir is randomly connected, with fixed weights that do not change during the learning process; only the weights from the reservoir to the output are trained. Since the reservoir is fixed during training, we may wonder whether the computational power of the recurrent structure is fully harnessed. In this article, we propose a new computational model of the ESN type that represents the reservoir weights in the Fourier space and fine-tunes these weights by applying genetic algorithms in the frequency domain. The main interest is that this procedure operates in a much smaller space than the classical ESN, thus providing a dimensionality-reduction transformation of the initial method. The proposed technique allows us to exploit the benefits of a large recurrent structure while avoiding the training problems of gradient-based methods. We provide a detailed experimental study that demonstrates the good performance of our approach on well-known chaotic systems and real-world data.
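The core idea of the abstract can be sketched as follows: the full reservoir weight matrix is encoded by a small block of low-frequency Fourier coefficients, the genetic algorithm mutates only that small genome, and an inverse FFT expands it back into the complete reservoir. The snippet below is a minimal illustration in Python/NumPy, not the paper's actual procedure; the sizes, the single mutation step, and the spectral-radius rescaling are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a 100-unit reservoir tuned via a 16x16 coefficient grid.
n_units = 100   # reservoir is an n_units x n_units weight matrix
n_coeffs = 16   # low-frequency coefficients kept per axis (the GA "genome")

def reservoir_from_fourier(coeffs, n_units):
    """Expand a small grid of Fourier coefficients into a full reservoir.

    Only the low-frequency corner of the 2-D spectrum is populated; the
    inverse FFT then yields an n_units x n_units real-valued weight matrix.
    """
    spectrum = np.zeros((n_units, n_units), dtype=complex)
    spectrum[:coeffs.shape[0], :coeffs.shape[1]] = coeffs
    W = np.fft.ifft2(spectrum).real
    # Rescale so the spectral radius stays below 1 (echo state property);
    # the 0.9 target is an assumption, not a value from the paper.
    rho = np.abs(np.linalg.eigvals(W)).max()
    return W * (0.9 / rho)

# GA-style search acts on the small frequency-domain genome (256 genes),
# not on the 10,000 entries of the reservoir itself.
genome = rng.normal(size=(n_coeffs, n_coeffs))
mutant = genome + 0.1 * rng.normal(size=genome.shape)  # one mutation step

W = reservoir_from_fourier(mutant, n_units)
print(W.shape)  # (100, 100)
```

The dimensionality reduction mentioned in the abstract shows up here as the ratio between the genome (16 × 16 = 256 coefficients) and the expanded reservoir (100 × 100 = 10,000 weights) that a classical ESN would otherwise have to search directly.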