Paper Title

Kernel-based L_2-Boosting with Structure Constraints

Paper Authors

Yao Wang, Xin Guo, Shao-Bo Lin

Paper Abstract

Developing efficient kernel methods for regression has been very popular over the past decade. In this paper, by boosting kernel-based weak learners, we propose a novel kernel-based learning algorithm called kernel-based re-scaled boosting with truncation, dubbed KReBooT. The proposed KReBooT controls the structure of the estimator, produces sparse estimates, and is nearly resistant to overfitting. We conduct both theoretical analysis and numerical simulations to illustrate the power of KReBooT. Theoretically, we prove that KReBooT achieves an almost optimal numerical convergence rate for nonlinear approximation. Furthermore, using the recently developed integral-operator approach and a variant of Talagrand's concentration inequality, we derive fast learning rates for KReBooT, a new record for boosting-type algorithms. Numerically, we carry out a series of simulations showing KReBooT's promising performance in terms of good generalization, near-resistance to overfitting, and structure constraints.
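To make the algorithmic idea in the abstract concrete, below is a minimal Python sketch of L_2-boosting over kernel weak learners with re-scaling and truncation. It is an illustration under assumptions, not the paper's exact procedure: the Gaussian kernel, the re-scaling schedule alpha_k = 2/(k+1), the fixed step size, and the l1-style truncation rule are all placeholder choices, and the names `kreboot_sketch`, `step`, `trunc`, and `width` are hypothetical.

```python
import numpy as np

def gaussian_kernel(X1, X2, width=1.0):
    """Gaussian kernel matrix between the rows of X1 and X2."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def kreboot_sketch(X, y, n_iter=200, step=0.1, trunc=5.0, width=1.0):
    """Greedy L2-boosting over the kernel weak learners {K(x_i, .)}:
    each iteration re-scales (shrinks) the current estimator, adds the
    atom best correlated with the residual, then truncates the l1 norm
    of the coefficient vector to control structure and sparsity.
    All hyper-parameters here are illustrative, not the paper's."""
    n = y.shape[0]
    K = gaussian_kernel(X, X, width)   # columns = candidate weak learners
    coef = np.zeros(n)                 # coefficients of the kernel expansion
    f = np.zeros(n)                    # current estimator on the sample
    for k in range(1, n_iter + 1):
        r = y - f                      # residuals of the current fit
        scores = K @ r                 # correlation of each atom with r
        j = int(np.argmax(np.abs(scores)))
        alpha = 2.0 / (k + 1)          # assumed re-scaling schedule
        coef *= 1.0 - alpha            # shrink the previous estimator
        coef[j] += step * np.sign(scores[j])
        l1 = np.abs(coef).sum()        # truncation: bound the l1 norm
        if l1 > trunc:
            coef *= trunc / l1
        f = K @ coef
    return coef

# Prediction at new points X_test (hypothetical usage):
#   y_hat = gaussian_kernel(X_test, X_train, width) @ coef
```

At each iteration the sketch picks the kernel atom most correlated with the residual, shrinks the whole previous expansion by the re-scaling factor, and rescales the coefficient vector whenever its l1 norm exceeds the truncation level; this last step is what bounds the structure of the estimator and keeps the expansion sparse, matching the role the abstract attributes to truncation.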
