Paper Title
On The Energy Statistics of Feature Maps in Pruning of Neural Networks with Skip-Connections
Paper Authors
Paper Abstract
We propose a new structured pruning framework for compressing Deep Neural Networks (DNNs) with skip connections, based on measuring the statistical dependence between hidden layers and predicted outputs. The dependence measure, defined via the energy statistics of hidden layers, serves as a model-free measure of information between the feature maps and the output of the network. The estimated dependence measure is subsequently used to prune a collection of redundant and uninformative layers. The model-freeness of our measure guarantees that no parametric assumptions on the feature map distribution are required, making it computationally appealing for the very high-dimensional feature spaces in DNNs. Extensive numerical experiments on various architectures demonstrate the efficacy of the proposed pruning approach, with performance competitive with state-of-the-art methods.
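
The abstract does not spell out the estimator, but energy-statistics dependence measures of this kind are commonly computed as an empirical distance covariance between samples of the feature map and the network output. The following minimal NumPy sketch illustrates that computation under this assumption; all names here (e.g. distance_covariance) are illustrative, not the authors' code.

import numpy as np

def centered_distance_matrix(X):
    # Pairwise Euclidean distance matrix, double-centered:
    # subtract row means and column means, add back the grand mean.
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    return D - D.mean(0, keepdims=True) - D.mean(1, keepdims=True) + D.mean()

def distance_covariance(X, Y):
    # Empirical distance covariance, an energy-statistics dependence
    # measure: it vanishes in the population limit iff X and Y are
    # independent, and it requires no parametric assumptions on either
    # distribution.
    A = centered_distance_matrix(X)
    B = centered_distance_matrix(Y)
    return np.sqrt(max((A * B).mean(), 0.0))

# Hypothetical usage: score a layer of a skip-connected network by the
# dependence between its (flattened) feature maps and the predictions;
# layers with low scores would be candidates for pruning.
rng = np.random.default_rng(0)
feature_map = rng.normal(size=(128, 256))   # n samples x flattened features
predictions = rng.normal(size=(128, 10))    # n samples x output dimension
print(distance_covariance(feature_map, predictions))

Because a layer bypassed by a skip connection can be removed without disconnecting the network, such scores give a natural ranking for deciding which layers are redundant enough to prune.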