Paper Title

Angle-based Search Space Shrinking for Neural Architecture Search

Paper Authors

Yiming Hu, Yuding Liang, Zichao Guo, Ruosi Wan, Xiangyu Zhang, Yichen Wei, Qingyi Gu, Jian Sun

Paper Abstract

In this work, we present a simple and general search space shrinking method, called Angle-Based search space Shrinking (ABS), for Neural Architecture Search (NAS). Our approach progressively simplifies the original search space by dropping unpromising candidates, thereby reducing the difficulty for existing NAS methods to find superior architectures. In particular, we propose an angle-based metric to guide the shrinking process. We provide comprehensive evidence showing that, in a weight-sharing supernet, the proposed metric is more stable and accurate than accuracy-based and magnitude-based metrics at predicting the capability of child models. We also show that the angle-based metric converges quickly during supernet training, enabling us to obtain promising shrunk search spaces efficiently. ABS can be easily applied to most NAS approaches (e.g., SPOS, FairNAS, ProxylessNAS, DARTS, and PDARTS). Comprehensive experiments show that ABS can dramatically enhance existing NAS approaches by providing a promising shrunk search space.
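
The abstract does not spell out how the angle-based metric is computed. As a minimal sketch, assuming the metric is the angle between a child model's weight vector at supernet initialization and the corresponding vector after supernet training, it could be computed as below; the function name angle_metric and the list-of-arrays interface are illustrative assumptions, not taken from the paper:

    import numpy as np

    def angle_metric(weights_init, weights_trained):
        # Flatten and concatenate a child model's weights (the slices it
        # inherits from the supernet) into a single vector at each stage.
        v0 = np.concatenate([w.ravel() for w in weights_init])
        vt = np.concatenate([w.ravel() for w in weights_trained])
        # Angle (in radians) between the initialized and trained weight
        # vectors; the clip guards against floating-point round-off
        # pushing the cosine slightly outside [-1, 1].
        cos = np.dot(v0, vt) / (np.linalg.norm(v0) * np.linalg.norm(vt))
        return np.arccos(np.clip(cos, -1.0, 1.0))

Under this reading, each shrinking round would score every candidate with this metric and drop the lowest-ranked ones, progressively simplifying the search space as the abstract describes.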
