Paper Title

Optimizing the Consumption of Spiking Neural Networks with Activity Regularization

Authors

Simon Narduzzi, Siavash A. Bigdeli, Shih-Chii Liu, L. Andrea Dunbar

Abstract

Reducing energy consumption is a critical concern for neural network models running on edge devices. In this regard, reducing the number of multiply-accumulate (MAC) operations of Deep Neural Networks (DNNs) running on edge hardware accelerators will reduce the energy consumed during inference. Spiking Neural Networks (SNNs) are an example of bio-inspired techniques that can further save energy by using binary activations and by consuming no energy when neurons do not spike. SNNs can be configured to reach accuracy equivalent to a DNN on a task through DNN-to-SNN conversion frameworks, but because the conversion is based on rate coding, the number of synaptic operations can be high. In this work, we investigate different techniques to enforce sparsity on neural network activation maps and compare the effect of different training regularizers on the efficiency of the optimized DNNs and SNNs.
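
The abstract does not specify which regularizers the paper compares. As one illustrative sketch (an assumption, not the authors' exact method), the snippet below adds an L1 activity penalty on a hidden activation map in PyTorch; the model, the penalty weight lam, and the dummy data are all hypothetical:

```python
import torch
import torch.nn as nn

class ActivityRegularizedMLP(nn.Module):
    """Small MLP that exposes its hidden activation map so a
    sparsity (activity) penalty can be added to the task loss."""

    def __init__(self, in_dim=784, hidden=128, out_dim=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, out_dim)

    def forward(self, x):
        act = torch.relu(self.fc1(x))  # activation map to be sparsified
        return self.fc2(act), act

model = ActivityRegularizedMLP()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 1e-4  # regularization strength (hypothetical value)

x = torch.randn(32, 784)           # dummy input batch
y = torch.randint(0, 10, (32,))    # dummy labels

optimizer.zero_grad()
logits, act = model(x)
# L1 activity regularization: penalizing the mean absolute activation
# drives many activations to zero, so fewer MACs are needed in the DNN
# and fewer spikes (synaptic operations) after DNN-to-SNN conversion.
loss = criterion(logits, y) + lam * act.abs().mean()
loss.backward()
optimizer.step()
```

Because rate-coded SNNs emit spike counts proportional to the source DNN's activation magnitudes, shrinking those activations during training directly reduces synaptic operations after conversion.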
