Paper Title
Channel Pruning via Multi-Criteria based on Weight Dependency
Paper Authors
Paper Abstract
Channel pruning has demonstrated its effectiveness in compressing ConvNets. In many related works, the importance of an output feature map is determined solely by its associated filter. However, these methods ignore a small portion of the weights in the next layer, which disappears when the feature map is removed; that is, they ignore the phenomenon of weight dependency. Besides, many pruning methods use only one criterion for evaluation and search for a sweet spot between pruned structure and accuracy in a trial-and-error fashion, which can be time-consuming. In this paper, we propose a channel pruning algorithm via multi-criteria based on weight dependency, CPMC, which can compress a pre-trained model directly. CPMC defines channel importance in three aspects: the associated weight values, the computational cost, and the parameter quantity. According to the phenomenon of weight dependency, CPMC obtains channel importance by assessing both the associated filter and the corresponding partial weights in the next layer. CPMC then uses global normalization to enable cross-layer comparison. Finally, CPMC removes the less important channels by global ranking. CPMC can compress various CNN models, including VGGNet, ResNet, and DenseNet, on various image classification datasets. Extensive experiments show that CPMC significantly outperforms other methods.
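To make the abstract's scoring idea concrete, below is a minimal PyTorch-style sketch of weight-dependency-aware channel scoring: a channel's score combines the L1 norm of its own filter with the norm of the next-layer weight slice that would vanish with it, plus per-channel cost terms, after which scores are normalized per layer and ranked globally. The function names, the `alpha`/`beta`/`gamma` trade-off weights, the per-channel `flops_cost`/`param_cost` tensors, and the sum-based normalization are illustrative assumptions for a plain conv-to-conv connection, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

def channel_importance(conv: nn.Conv2d, next_conv: nn.Conv2d,
                       flops_cost: torch.Tensor, param_cost: torch.Tensor,
                       alpha: float = 1.0, beta: float = 1.0,
                       gamma: float = 1.0) -> torch.Tensor:
    # L1 norm of each output filter of the current layer: shape [out_channels].
    own = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    # Weight dependency: pruning output channel c of `conv` also removes
    # next_conv.weight[:, c, :, :], so include that slice's L1 norm as well.
    dep = next_conv.weight.detach().abs().sum(dim=(0, 2, 3))
    weight_term = own + dep
    # Multi-criteria score: weight magnitude, FLOPs saved, parameters saved
    # (alpha/beta/gamma are hypothetical trade-off weights).
    return alpha * weight_term + beta * flops_cost + gamma * param_cost

def global_prune_masks(scores_per_layer, prune_ratio: float = 0.3):
    # Normalize each layer's scores so they are comparable across layers,
    # then rank every channel in the network and mark the lowest-scoring
    # fraction for removal.
    normed = [s / s.sum() for s in scores_per_layer]
    flat = torch.cat(normed)
    k = max(1, int(prune_ratio * flat.numel()))
    threshold = flat.kthvalue(k).values
    return [s <= threshold for s in normed]  # True = prune this channel
```

As a usage note, one would compute `channel_importance` for each prunable layer of the pre-trained model, pass the list of score tensors to `global_prune_masks`, and then physically remove the masked channels (and the dependent next-layer slices) before fine-tuning.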