Paper title
Convolutional Neural Network Pruning Using Filter Attenuation
Paper authors
Paper abstract
Filters are the essential elements of convolutional neural networks (CNNs). Filters correspond to feature maps and account for the main part of the computational and memory requirements of CNN processing. In filter pruning methods, a filter is removed together with all of its components, including its channels and connections. Removing a filter can cause a drastic change in the network's performance, and a removed filter cannot return to the network structure. We address these problems in this paper. We propose a CNN pruning method based on filter attenuation, in which weak filters are not directly removed. Instead, weak filters are attenuated and gradually removed. In the proposed attenuation approach, weak filters are not abruptly removed, and these filters have a chance to return to the network. The filter attenuation method is assessed using the VGG model on the CIFAR-10 image classification task. Simulation results show that filter attenuation works with different pruning criteria and obtains better results than conventional pruning methods.
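The core idea of attenuation-based pruning can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the `decay` factor, the removal threshold `eps`, and the use of the L1 norm as the pruning criterion are all assumptions here (the abstract only states that weak filters are scaled down gradually and dropped later, and that several criteria are possible).

```python
import numpy as np

def attenuate_filters(filters, prune_ratio=0.5, decay=0.5, eps=1e-3):
    """One attenuation step: scale down the weakest filters instead of
    deleting them outright. A filter is considered removed only once its
    norm has decayed below `eps`; until then it can still recover during
    training. `decay`, `eps`, and the L1-norm criterion are illustrative
    choices, not values fixed by the paper.
    """
    filters = filters.copy()
    # Pruning criterion: L1 norm of each filter (one common choice).
    norms = np.abs(filters).reshape(len(filters), -1).sum(axis=1)
    n_weak = int(len(filters) * prune_ratio)
    weak = np.argsort(norms)[:n_weak]   # indices of the weakest filters
    filters[weak] *= decay              # attenuate rather than remove
    # Keep-mask: filters whose norm is still above the removal threshold.
    keep = np.abs(filters).reshape(len(filters), -1).sum(axis=1) > eps
    return filters, keep
```

In a training loop, this step would be interleaved with fine-tuning, so an attenuated filter whose weights grow back above the other filters' norms simply stops being selected and returns to the network; filters that stay weak decay geometrically until they fall below `eps` and can be removed without a sudden change in the network's output.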