Paper Title
Transfer Learning based Search Space Design for Hyperparameter Tuning
Paper Authors
Paper Abstract
Hyperparameter tuning has become increasingly important as machine learning (ML) models are extensively applied in data mining applications. Among various approaches, Bayesian optimization (BO) is a successful methodology for tuning hyperparameters automatically. While traditional methods optimize each tuning task in isolation, there has been recent interest in speeding up BO by transferring knowledge across previous tasks. In this work, we introduce an automatic method to design the BO search space with the aid of tuning histories from past tasks. This simple yet effective approach can be used to endow many existing BO methods with transfer learning capabilities. In addition, it enjoys three advantages: universality, generality, and safeness. Extensive experiments show that our approach considerably boosts BO by designing a promising and compact search space instead of using the entire space, and outperforms state-of-the-art methods on a wide range of benchmarks, including machine learning and deep learning tuning tasks as well as neural architecture search.
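The abstract describes the idea only at a high level. As a concrete illustration, the following is a minimal sketch of one way a transfer-learning-based search space design could work: it derives a compact bounding box from the top-performing configurations recorded on past tasks and clips it back into the full space. The function name design_compact_space, the top-fraction heuristic, and the margin parameter are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def design_compact_space(history, lower, upper, top_frac=0.1, margin=0.05):
    """Shrink the full search space [lower, upper] to a compact box.

    history: list of (config, loss) pairs from previous tuning tasks,
             where each config is a length-d array inside the full space.
    NOTE: this is a hypothetical sketch, not the paper's method.
    """
    configs = np.array([c for c, _ in history], dtype=float)
    losses = np.array([l for _, l in history], dtype=float)
    k = max(1, int(top_frac * len(history)))
    top = configs[np.argsort(losses)[:k]]       # best k configs on past tasks
    lo, hi = top.min(axis=0), top.max(axis=0)   # tight bounding box around them
    pad = margin * (upper - lower)              # small safety margin per dimension
    # Clip back into the original space: the reduced space is always a subset
    # of the full one, so no configuration outside it is ever proposed.
    return np.maximum(lo - pad, lower), np.minimum(hi + pad, upper)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lower, upper = np.zeros(2), np.ones(2)
    # Synthetic "past-task" history: losses are lowest near (0.3, 0.7).
    hist = [(x, float(np.sum((x - np.array([0.3, 0.7])) ** 2)))
            for x in rng.uniform(lower, upper, size=(200, 2))]
    lo, hi = design_compact_space(hist, lower, upper)
    print("compact search space:", lo, hi)
```

Because the output is just a pair of reduced bounds that can be handed to any off-the-shelf BO implementation, a design of this shape loosely mirrors the three properties the abstract names: it works with any BO method (universality and generality) and never proposes configurations outside the original space (safeness).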