Title


Evolutionary algorithms for hyperparameter optimization in machine learning for application in high energy physics

Authors

Laurits Tani, Diana Rand, Christian Veelken, Mario Kadastik

Abstract


The analysis of vast amounts of data constitutes a major challenge in modern high energy physics experiments. Machine learning (ML) methods, typically trained on simulated data, are often employed to facilitate this task. Several choices need to be made by the user when training the ML algorithm. In addition to deciding which ML algorithm to use and choosing suitable observables as inputs, users typically need to choose among a plethora of algorithm-specific parameters. We refer to the parameters that need to be chosen by the user as hyperparameters. These are to be distinguished from the parameters that the ML algorithm learns autonomously during training, without intervention by the user. The choice of hyperparameters is conventionally made manually by the user and often has a significant impact on the performance of the ML algorithm. In this paper, we explore two evolutionary algorithms, particle swarm optimization (PSO) and the genetic algorithm (GA), for choosing optimal hyperparameter values in an autonomous manner. Both algorithms are tested on different datasets and compared to alternative methods.
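To illustrate the kind of search the abstract describes, the following is a minimal sketch of particle swarm optimization applied to a two-dimensional hyperparameter space. It is not the authors' implementation: the toy `objective` function stands in for a real validation score (e.g. a cross-validated AUC), and the two tuned quantities, a learning-rate-like and a depth-like value, are hypothetical examples.

```python
import random

def objective(lr, depth):
    # Toy surrogate for an ML validation score; in practice this would
    # train a model with the given hyperparameters and return its score.
    # The toy optimum sits at lr = 0.1, depth = 6.
    return -((lr - 0.1) ** 2 + 0.01 * (depth - 6) ** 2)

def pso(n_particles=20, n_iters=50, seed=0):
    rng = random.Random(seed)
    bounds = [(1e-4, 1.0), (1.0, 12.0)]  # search range per hyperparameter
    # Random initial positions, zero initial velocities.
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * len(bounds) for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # each particle's best position
    pbest_val = [objective(*p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best

    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, and social weights
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(len(bounds)):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: pull toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(*pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best_pos, best_val = pso()
```

Each evaluation of `objective` corresponds to one full training of the ML algorithm, which is why the paper's focus on the efficiency of the search strategy matters in practice.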
