Paper Title

GeneCAI: Genetic Evolution for Acquiring Compact AI

Authors

Mojan Javaheripi, Mohammad Samragh, Tara Javidi, Farinaz Koushanfar

Abstract

In the contemporary big data realm, Deep Neural Networks (DNNs) are evolving towards more complex architectures to achieve higher inference accuracy. Model compression techniques can be leveraged to efficiently deploy such compute-intensive architectures on resource-limited mobile devices. Such methods comprise various hyper-parameters that require per-layer customization to ensure high accuracy. Choosing such hyper-parameters is cumbersome, as the pertinent search space grows exponentially with the number of model layers. This paper introduces GeneCAI, a novel optimization method that automatically learns how to tune per-layer compression hyper-parameters. We devise a bijective translation scheme that encodes compressed DNNs into the genotype space. The optimality of each genotype is measured using a multi-objective score based on accuracy and the number of floating-point operations. We develop customized genetic operations to iteratively evolve the non-dominated solutions towards the optimal Pareto front, thus capturing the optimal trade-off between model accuracy and complexity. The GeneCAI optimization method is highly scalable and can achieve a near-linear performance boost on distributed multi-GPU platforms. Our extensive evaluations demonstrate that GeneCAI outperforms existing rule-based and reinforcement-learning methods for DNN compression by finding models that lie on a better accuracy-complexity Pareto curve.
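To make the search concrete, below is a minimal, self-contained sketch of the kind of multi-objective genetic loop the abstract describes: a genotype encodes one compression rate per layer, each genotype is scored on (accuracy, FLOPs), and the non-dominated individuals seed the next generation. Everything here (the genotype layout, the toy evaluate function, and the operator choices) is an illustrative assumption based only on the abstract, not the authors' implementation.

```python
# Toy sketch of multi-objective genetic search over per-layer compression
# rates. All specifics are assumptions for illustration, not GeneCAI's code.
import random

NUM_LAYERS = 8      # assumed model depth
POP_SIZE = 20
GENERATIONS = 30

def random_genotype():
    """A genotype holds one compression rate per layer (the per-layer
    hyper-parameters the abstract refers to), each in [0.1, 1.0]."""
    return [random.uniform(0.1, 1.0) for _ in range(NUM_LAYERS)]

def evaluate(genotype):
    """Placeholder objectives. The real method would compress the DNN and
    measure validation accuracy and FLOPs; here we fake a trade-off where
    heavier compression lowers both FLOPs and accuracy."""
    kept = sum(genotype) / NUM_LAYERS
    flops = kept                                # normalized FLOPs, lower is better
    accuracy = 1.0 - 0.5 * (1.0 - kept) ** 2    # toy accuracy curve
    return accuracy, flops

def dominates(a, b):
    """a dominates b if it is no worse in both objectives and strictly
    better in at least one (maximize accuracy, minimize FLOPs)."""
    (acc_a, fl_a), (acc_b, fl_b) = a, b
    return acc_a >= acc_b and fl_a <= fl_b and (acc_a > acc_b or fl_a < fl_b)

def pareto_front(pop, scores):
    """Keep the individuals that no other individual dominates."""
    return [pop[i] for i, s in enumerate(scores)
            if not any(dominates(scores[j], s)
                       for j in range(len(pop)) if j != i)]

def crossover(p1, p2):
    """Uniform crossover over per-layer rates."""
    return [random.choice(pair) for pair in zip(p1, p2)]

def mutate(g, rate=0.2):
    """Perturb each layer's rate with small Gaussian noise, clipped to range."""
    return [min(1.0, max(0.1, v + random.gauss(0, 0.1)))
            if random.random() < rate else v for v in g]

population = [random_genotype() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    scores = [evaluate(g) for g in population]
    elite = pareto_front(population, scores)
    # Refill the population by recombining and mutating non-dominated parents.
    children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                for _ in range(POP_SIZE - len(elite))]
    population = elite + children

for g in pareto_front(population, [evaluate(g) for g in population]):
    acc, fl = evaluate(g)
    print(f"accuracy={acc:.3f}  flops={fl:.3f}")
```

In the paper's setting, the evaluation step would compress and run the actual DNN, and since each genotype can be scored independently, this is plausibly where the near-linear multi-GPU scaling the abstract claims comes from.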
