Paper Title

TCL: an ANN-to-SNN Conversion with Trainable Clipping Layers

Authors

Nguyen-Dong Ho, Ik-Joon Chang

Abstract

Spiking-neural-networks (SNNs) are promising at edge devices since the event-driven operations of SNNs provide significantly lower power compared to analog-neural-networks (ANNs). Although it is difficult to efficiently train SNNs, many techniques to convert trained ANNs to SNNs have been developed. However, after the conversion, a trade-off relation between accuracy and latency exists in SNNs, causing considerable latency on large datasets such as ImageNet. We present a technique, named TCL, to alleviate the trade-off problem, enabling an accuracy of 73.87% (VGG-16) and 70.37% (ResNet-34) on ImageNet with a moderate latency of 250 cycles in SNNs.
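To make the core idea concrete, below is a minimal, hedged sketch of a trainable clipping activation. The function names (`tcl_forward`, `tcl_grad_theta`) and the scalar treatment of the threshold are illustrative assumptions, not the paper's implementation; the sketch only shows the principle that a clipped ReLU with a learnable upper bound `theta` keeps activations bounded, so the ANN-to-SNN conversion can normalize by `theta` rather than by an outlier-prone maximum activation.

```python
def tcl_forward(xs, theta):
    """Trainable clipping layer (sketch): ReLU clipped at a
    learnable upper bound theta, i.e. clip(x, 0, theta).
    `theta` is assumed to be a scalar trained jointly with the
    network weights."""
    return [min(max(x, 0.0), theta) for x in xs]

def tcl_grad_theta(xs, theta):
    """Subgradient of the clip w.r.t. theta (illustrative):
    1 where the input saturates (x >= theta), 0 elsewhere.
    This is what lets theta be updated by backpropagation."""
    return [1.0 if x >= theta else 0.0 for x in xs]

# Activations above theta saturate; after training, theta serves as
# the normalization factor for the ANN-to-SNN conversion.
print(tcl_forward([-1.0, 0.5, 2.0, 5.0], 2.0))  # [0.0, 0.5, 2.0, 2.0]
print(tcl_grad_theta([1.0, 3.0], 2.0))          # [0.0, 1.0]
```

Because `theta` is learned, the network can trade a small amount of ANN accuracy for a much tighter activation range, which is what reduces the SNN latency needed to reach a given accuracy.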
