Title
Learning Precise Spike Timings with Eligibility Traces
Authors
Abstract
Recent research in the field of spiking neural networks (SNNs) has shown that recurrent variants of SNNs, namely long short-term SNNs (LSNNs), can be trained via error gradients just as effectively as LSTMs. The underlying learning method (e-prop) is based on a formalization of eligibility traces applied to leaky integrate-and-fire (LIF) neurons. Here, we show that the proposed approach cannot fully unfold spike-timing-dependent plasticity (STDP). As a consequence, this in principle limits the inherent advantage of SNNs, that is, the potential to develop codes that rely on precise relative spike timings. We show that STDP-aware synaptic gradients naturally emerge within the eligibility equations of e-prop when derived for a slightly more complex spiking neuron model, demonstrated here with the Izhikevich model. We also present a simple extension of the LIF model that provides similar gradients. In a simple experiment, we demonstrate that STDP-aware LIF neurons can learn precise spike timings from an e-prop-based gradient signal.
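To make the abstract's notion of eligibility traces concrete, the following is a minimal sketch (not the paper's exact formulation) of a discrete-time LIF neuron that accumulates an e-prop-style eligibility trace per input synapse: a low-pass-filtered presynaptic spike trace gated by a pseudo-derivative of the spike nonlinearity. All names, parameter values, and the particular pseudo-derivative shape are illustrative assumptions.

```python
import numpy as np

def run_lif_with_traces(spikes_in, w, alpha=0.9, v_th=1.0, gamma=0.3):
    """Simulate one LIF neuron and accumulate per-synapse eligibility traces.

    spikes_in: (T, n_in) binary input spike array
    w:         (n_in,) synaptic weights
    alpha:     membrane/trace decay factor (illustrative value)
    Returns output spikes (T,) and final eligibility traces (n_in,).
    """
    T, n_in = spikes_in.shape
    v = 0.0                      # membrane potential
    z_bar = np.zeros(n_in)       # low-pass-filtered presynaptic spikes
    e = np.zeros(n_in)           # eligibility traces
    out = np.zeros(T)
    for t in range(T):
        v = alpha * v + spikes_in[t] @ w
        z = float(v >= v_th)     # hard-threshold spike
        if z:
            v -= v_th            # soft reset after a spike
        out[t] = z
        z_bar = alpha * z_bar + spikes_in[t]
        # triangular pseudo-derivative of the spike w.r.t. the potential
        psi = gamma * max(0.0, 1.0 - abs((v - v_th) / v_th))
        e += psi * z_bar         # gate the filtered trace into eligibility
    return out, e
```

Because the trace only combines a filtered presynaptic term with a postsynaptic pseudo-derivative, it illustrates the structure the abstract refers to; the paper's point is that for the plain LIF model this structure does not capture full STDP-like timing sensitivity.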