Title

AML-SVM: Adaptive Multilevel Learning with Support Vector Machines

Authors

Ehsan Sadrfaridpour, Korey Palmer, Ilya Safro

Abstract

The support vector machine (SVM) is one of the most widely used and practical optimization-based classification models in machine learning because of its interpretability and flexibility to produce high-quality results. However, big data imposes certain difficulties on the most sophisticated but relatively slow version of SVM, namely, the nonlinear SVM. The complexity of nonlinear SVM solvers and the number of elements in the kernel matrix increase quadratically with the number of samples in the training data. Therefore, both runtime and memory requirements are negatively affected. Moreover, parameter fitting involves extra kernel parameters to tune, which exacerbates the runtime even further. This paper proposes an adaptive multilevel learning framework for the nonlinear SVM, which addresses these challenges, improves the classification quality across the refinement process, and leverages multi-threaded parallel processing for better performance. The integration of parameter fitting into the hierarchical learning framework, together with an adaptive process that stops unnecessary computation, significantly reduces the running time while increasing the overall performance. The experimental results demonstrate reduced variance in predictions on validation and test data across levels of the hierarchy, and a significant speedup over state-of-the-art nonlinear SVM libraries without a decrease in classification quality. The code is accessible at https://github.com/esadr/amlsvm.
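For concreteness on the quadratic-growth claim: the kernel matrix for n training samples has n x n entries, so at 8 bytes per entry, n = 10^6 samples already imply roughly 8 TB of memory. The sketch below illustrates the multilevel idea from the abstract in miniature; it is not the authors' implementation (the code at https://github.com/esadr/amlsvm is a separate C++ framework). Coarsening by k-means centroids, the helper names `coarsen` and `fit_level`, and the plateau threshold are all illustrative assumptions, not the paper's method.

```python
# Minimal multilevel-SVM sketch (assumptions labeled in the lead-in above).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def coarsen(X, y, factor=4):
    """Build a coarser training set by replacing each class with k-means
    centroids (an illustrative stand-in for the paper's coarsening)."""
    Xc, yc = [], []
    for label in np.unique(y):
        pts = X[y == label]
        k = max(2, len(pts) // factor)
        km = KMeans(n_clusters=k, n_init=4, random_state=0).fit(pts)
        Xc.append(km.cluster_centers_)
        yc.append(np.full(k, label))
    return np.vstack(Xc), np.concatenate(yc)

def fit_level(X, y, Xval, yval, C_grid, g_grid):
    """Small grid search at one level; returns best model, score, params."""
    best = (None, -1.0, None)
    for C in C_grid:
        for g in g_grid:
            m = SVC(C=C, gamma=g, kernel="rbf").fit(X, y)
            acc = accuracy_score(yval, m.predict(Xval))
            if acc > best[1]:
                best = (m, acc, (C, g))
    return best

# Toy data standing in for a large training set.
X, y = make_classification(n_samples=4000, n_features=10, random_state=0)
Xtr, Xval, ytr, yval = train_test_split(X, y, test_size=0.25, random_state=0)

# Hierarchy: finest level is the full data; coarser levels shrink it.
levels = [(Xtr, ytr)]
while len(levels[-1][0]) > 200:
    levels.append(coarsen(*levels[-1]))

# Train coarsest-to-finest. Each finer level inherits a narrowed (C, gamma)
# grid around the coarser level's best parameters (the "integrated parameter
# fitting"), and refinement halts once validation accuracy plateaus (the
# "adaptive process to stop unnecessary computation").
C_grid, g_grid = np.logspace(-2, 3, 6), np.logspace(-4, 1, 6)
prev_acc = -1.0
for X_l, y_l in reversed(levels):
    model, acc, (C, g) = fit_level(X_l, y_l, Xval, yval, C_grid, g_grid)
    print(f"level size={len(X_l):5d}  val acc={acc:.3f}  C={C:.3g}  gamma={g:.3g}")
    if acc <= prev_acc + 1e-3:  # no meaningful gain: skip finer levels
        break
    prev_acc = acc
    C_grid = np.logspace(np.log10(C) - 1, np.log10(C) + 1, 3)
    g_grid = np.logspace(np.log10(g) - 1, np.log10(g) + 1, 3)
```

The saving in this sketch comes from running the full grid search only on the small coarsest set: finer (larger) levels fit far fewer models on the inherited grid, and training stops entirely once extra refinement no longer improves validation accuracy.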
