Paper Title
BLOX: Macro Neural Architecture Search Benchmark and Algorithms
Paper Authors
Paper Abstract
Neural architecture search (NAS) has been successfully used to design numerous high-performance neural networks. However, NAS is typically compute-intensive, so most existing approaches restrict the search to decide the operations and topological structure of a single block only; the same block is then stacked repeatedly to form an end-to-end model. Although such an approach reduces the size of the search space, recent studies show that a macro search space, which allows blocks in a model to be different, can lead to better performance. To provide a systematic study of the performance of NAS algorithms on a macro search space, we release Blox, a benchmark that consists of 91k unique models trained on the CIFAR-100 dataset. The dataset also includes runtime measurements of all the models on a diverse set of hardware platforms. We perform extensive experiments to compare existing algorithms that are well studied on cell-based search spaces with the emerging blockwise approaches that aim to make NAS scalable to much larger macro search spaces. The benchmark and code are available at https://github.com/SamsungLabs/blox.
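To illustrate why a macro search space is so much larger than a cell-based one, the sketch below enumerates both under a toy setup. The candidate block names and the number of block positions are hypothetical and do not reflect the actual Blox search space; the point is only the combinatorial difference: with K candidate blocks and B positions, a cell-based space has K models, while a macro space has K^B.

```python
from itertools import product

# Hypothetical candidate block designs (the real Blox space differs).
candidates = ["conv3x3", "conv5x5", "mbconv"]
num_positions = 4  # number of block slots in the end-to-end model

# Cell-based: one block design is chosen and repeated at every position,
# so there are only len(candidates) distinct models.
cell_based = [(c,) * num_positions for c in candidates]

# Macro: each position may use a different block design,
# giving len(candidates) ** num_positions distinct models.
macro = list(product(candidates, repeat=num_positions))

print(len(cell_based))  # 3
print(len(macro))       # 81, i.e. 3**4
```

Even in this toy case the macro space is 27x larger, which is why blockwise approaches that decompose the search are needed to keep NAS tractable at this scale.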