Paper Title

Neural Network Panning: Screening the Optimal Sparse Network Before Training

Paper Authors

Xiatao Kang, Ping Li, Jiayi Yao, Chengxi Li

Paper Abstract

Pruning neural networks before training not only compresses the original models but also accelerates the network training phase, which has substantial application value. Current work focuses on fine-grained pruning, which uses metrics to compute weight scores for weight screening, and has extended from initial single-shot pruning to iterative pruning. Building on these works, we argue that network pruning can be summarized as an expressive-force transfer process among weights, where the reserved weights take on the expressive force of the removed ones in order to maintain the performance of the original network. To achieve optimal expressive-force scheduling, we propose a pruning scheme applied before training, called Neural Network Panning, which guides expressive-force transfer through multi-index and multi-process steps, and we design a reinforcement-learning-based panning agent to automate the process. Experimental results show that Panning outperforms various available pruning-before-training methods.
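
The abstract itself gives no code, but the "weight score" screening it builds on can be made concrete. Below is a minimal PyTorch sketch of one such metric family: a SNIP-style single-shot saliency score (|gradient * weight|) followed by global thresholding. The function names (`snip_style_scores`, `prune_by_scores`), the cross-entropy loss, and the toy model are assumptions for illustration only; they are not the Panning algorithm, which combines multiple indices over several steps and drives them with an RL agent.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def snip_style_scores(model, inputs, targets):
    """Compute |g * w| saliency per weight tensor (SNIP-style metric).

    Illustrative assumption: one of the single-shot 'weight score'
    metrics the abstract refers to, not Panning's multi-index scheme.
    """
    model.zero_grad()
    loss = F.cross_entropy(model(inputs), targets)
    loss.backward()
    scores = {}
    for name, p in model.named_parameters():
        if p.grad is not None and p.dim() > 1:  # score weight matrices, skip biases
            scores[name] = (p.grad * p).abs().detach()
    return scores

def prune_by_scores(model, scores, sparsity):
    """Zero out the globally lowest-scoring weights until `sparsity` is reached."""
    flat = torch.cat([s.flatten() for s in scores.values()])
    k = max(1, int(sparsity * flat.numel()))
    threshold = flat.kthvalue(k).values  # k-th smallest score = cutoff
    with torch.no_grad():
        for name, p in model.named_parameters():
            if name in scores:
                p.mul_((scores[name] > threshold).float())

# Hypothetical usage: screen a toy MLP at 90% sparsity before any training.
model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
x, y = torch.randn(64, 784), torch.randint(0, 10, (64,))
prune_by_scores(model, snip_style_scores(model, x, y), sparsity=0.9)
```

Repeating this score-then-prune loop at a gradually increasing sparsity approximates the iterative pruning the abstract mentions; Panning's contribution is scheduling which metric to apply at each step rather than fixing a single one.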
