Paper Title
Frugal Optimization for Cost-related Hyperparameters
Paper Authors
Paper Abstract
The increasing demand for democratizing machine learning algorithms calls for hyperparameter optimization (HPO) solutions at low cost. Many machine learning algorithms have hyperparameters that can cause a large variation in the training cost. But this effect is largely ignored in existing HPO methods, which are incapable of properly controlling cost during the optimization process. To address this problem, we develop a new cost-frugal HPO solution. The core of our solution is a simple but new randomized direct-search method, for which we prove a convergence rate of $O(\frac{\sqrt{d}}{\sqrt{K}})$ and an $O(d\epsilon^{-2})$-approximation guarantee on the total cost. We provide strong empirical results in comparison with state-of-the-art HPO methods on large AutoML benchmarks.
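As a rough illustration of the kind of randomized direct-search loop the abstract refers to, the sketch below probes a random unit direction (and its opposite) at each iteration, moves to an improving point if one is found, and otherwise shrinks the step size. The function name, step-size schedule, and stopping rule are illustrative assumptions, not the paper's exact algorithm, and the sketch does not model the paper's cost-control mechanism.

```python
import numpy as np

def randomized_direct_search(f, x0, step=1.0, max_iters=200, step_lower=1e-3):
    """Minimize f by a generic randomized direct search (illustrative sketch,
    not the paper's exact method). Each iteration probes a random unit
    direction and its opposite; an improving probe is accepted, otherwise
    the step size is halved until it falls below step_lower."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iters):
        # Sample a direction uniformly from the unit sphere in R^d.
        u = np.random.randn(x.size)
        u /= np.linalg.norm(u)
        moved = False
        for candidate in (x + step * u, x - step * u):
            fc = f(candidate)
            if fc < fx:
                # Accept the first improving probe and keep the same step.
                x, fx, moved = candidate, fc, True
                break
        if not moved:
            # No improvement in either direction: shrink the step size.
            step *= 0.5
            if step < step_lower:
                break
    return x, fx

# Example usage: minimize a simple quadratic in 5 dimensions.
x_best, f_best = randomized_direct_search(lambda x: float(np.sum(x ** 2)),
                                          x0=np.ones(5))
```

In an HPO setting, `f` would be the validation loss of a configuration and each probe would incur a training cost that depends on the hyperparameters, which is the cost the paper's method is designed to keep frugal.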