Paper Title

Training Stronger Spiking Neural Networks with Biomimetic Adaptive Internal Association Neurons

Paper Authors

Haibo Shen, Yihao Luo, Xiang Cao, Liangqi Zhang, Juyu Xiao, Tianjiang Wang

Paper Abstract

As the third generation of neural networks, spiking neural networks (SNNs) are dedicated to exploring more insightful neural mechanisms to achieve near-biological intelligence. Intuitively, biomimetic mechanisms are crucial to understanding and improving SNNs. For example, the associative long-term potentiation (ALTP) phenomenon suggests that in addition to learning mechanisms between neurons, there are associative effects within neurons. However, most existing methods only focus on the former and lack exploration of the internal association effects. In this paper, we propose a novel Adaptive Internal Association (AIA) neuron model to establish previously ignored influences within neurons. Consistent with the ALTP phenomenon, the AIA neuron model is adaptive to input stimuli, and internal associative learning occurs only when both dendrites are stimulated at the same time. In addition, we employ weighted weights to measure internal associations and introduce intermediate caches to reduce the volatility of associations. Extensive experiments on prevailing neuromorphic datasets show that the proposed method can potentiate or depress the firing of spikes more specifically, resulting in better performance with fewer spikes. It is worth noting that without adding any parameters at inference, the AIA model achieves state-of-the-art performance on the DVS-CIFAR10 (83.9%) and N-CARS (95.64%) datasets.
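The abstract describes the AIA mechanism only qualitatively, so below is a minimal Python sketch of one plausible reading: a two-dendrite leaky integrate-and-fire neuron whose internal association strength grows only under simultaneous stimulation of both dendrites, with an exponential-moving-average "intermediate cache" damping the volatility of that association. All names and update rules here (`AIANeuronSketch`, `assoc_lr`, `cache_momentum`, the multiplicative potentiation of the dendritic drive) are illustrative assumptions, not the paper's actual formulation.

```python
class AIANeuronSketch:
    """Toy two-dendrite LIF neuron with an internal-association term.

    A minimal sketch of the AIA idea as described in the abstract;
    the concrete update rules below are assumptions, not the paper's.
    """

    def __init__(self, tau=2.0, v_threshold=1.0,
                 assoc_lr=0.05, cache_momentum=0.9):
        self.tau = tau                        # membrane time constant
        self.v_threshold = v_threshold        # firing threshold
        self.assoc_lr = assoc_lr              # hypothetical association learning rate
        self.cache_momentum = cache_momentum  # hypothetical cache smoothing factor
        self.v = 0.0        # membrane potential
        self.assoc = 0.0    # internal association strength
        self.cache = 0.0    # intermediate cache of recent co-activity

    def step(self, x1, x2):
        """One timestep with non-negative inputs on two dendrites."""
        # Associative learning is driven only when BOTH dendrites are
        # stimulated simultaneously, mirroring the ALTP-style condition.
        co_active = 1.0 if (x1 > 0 and x2 > 0) else 0.0

        # Intermediate cache: an exponential moving average of co-activity,
        # used here to damp step-to-step volatility of the association.
        self.cache = (self.cache_momentum * self.cache
                      + (1.0 - self.cache_momentum) * co_active)
        self.assoc += self.assoc_lr * self.cache

        # The association multiplicatively potentiates the dendritic drive;
        # with assoc == 0 this reduces to a plain leaky integrate-and-fire.
        drive = (x1 + x2) * (1.0 + self.assoc)
        self.v += (drive - self.v) / self.tau

        spike = self.v >= self.v_threshold
        if spike:
            self.v = 0.0  # hard reset after firing
        return int(spike)


if __name__ == "__main__":
    neuron = AIANeuronSketch()
    # Both dendrites active -> association builds, firing is potentiated.
    print([neuron.step(0.6, 0.6) for _ in range(10)])
    # Single-dendrite input -> the co-activity signal is 0 and the cache decays.
    print([neuron.step(0.6, 0.0) for _ in range(10)])
```

Under this toy reading, repeated co-stimulation raises the drive on both dendrites (an ALTP-like potentiation), while one-sided input leaves the association essentially unchanged; the cache is what keeps a single coincident step from abruptly shifting the neuron's behavior.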
