Paper Title

Retrain or not retrain? -- efficient pruning methods of deep CNN networks

Authors

Pietron, Marcin; Wielgosz, Maciej

Abstract

Convolutional neural networks (CNNs) play a major role in image processing tasks such as image classification, object detection, and semantic segmentation. Very often CNN networks have from several to a hundred stacked layers with several megabytes of weights. One possible method to reduce complexity and memory footprint is pruning. Pruning is the process of removing weights which connect neurons from two adjacent layers in the network. The process of finding a near-optimal solution with a specified drop in accuracy can be more sophisticated when the DL model has a higher number of convolutional layers. In this paper, several approaches based on retraining and non-retraining are described and compared.
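The pruning operation defined in the abstract (zeroing weights between adjacent layers) can be sketched with one simple criterion, magnitude-based pruning; this is an illustrative example only, not necessarily one of the specific methods the paper compares, and the function name and NumPy usage are this sketch's own assumptions.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries of a weight matrix.

    A minimal sketch of magnitude-based pruning: `sparsity` is the
    fraction of weights to remove. Ties at the threshold may prune
    slightly more than requested.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Example: remove 50% of the weights of a small 2x2 layer
w = np.array([[0.1, -0.8], [0.05, 1.2]])
pruned = magnitude_prune(w, 0.5)
# The two smallest-magnitude weights (0.1 and 0.05) are zeroed,
# while -0.8 and 1.2 survive.
```

In practice this mask would be applied layer by layer, and the retraining vs. no-retraining question the paper studies is whether the network is fine-tuned after such a step to recover accuracy.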
