Paper Title
Behavior of Hyper-Parameters for Selected Machine Learning Algorithms: An Empirical Investigation
Paper Authors
Paper Abstract
Hyper-parameters (HPs) are an important part of machine learning (ML) model development and can greatly influence performance. This paper studies their behavior for three algorithms on structured data: Extreme Gradient Boosting (XGB), Random Forest (RF), and Feedforward Neural Networks (FFNN). Our empirical investigation examines the qualitative behavior of model performance as the HPs vary, quantifies the importance of each HP for the different ML algorithms, and assesses the stability of performance near the optimal region. Based on the findings, we propose a set of guidelines for efficient HP tuning by reducing the search space.
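To make the idea of tuning over a reduced search space concrete, the sketch below (Python with scikit-learn) tunes a Random Forest, one of the three algorithms studied, over a small set of HPs while leaving the rest at library defaults. This is a minimal illustration under assumed settings; the specific HPs, ranges, and search budget are hypothetical placeholders, not the paper's recommended values.

```python
# Minimal sketch (not the authors' code): random search over a reduced HP space
# for Random Forest on synthetic structured (tabular) data.
from scipy.stats import randint
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

# Synthetic tabular data standing in for a real structured dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Reduced search space: only a few HPs assumed to matter most are varied
# (ranges here are illustrative assumptions).
param_distributions = {
    "n_estimators": randint(100, 500),
    "max_depth": randint(3, 20),
    "max_features": ["sqrt", "log2"],
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions=param_distributions,
    n_iter=20,   # small budget, made feasible by the reduced space
    cv=5,
    random_state=0,
    n_jobs=-1,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

The same pattern applies to XGB and FFNN: restrict the search to the HPs found to be important for that algorithm and keep the remaining HPs fixed, which shrinks the tuning budget.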