Paper Title

On the robustness of minimum norm interpolators and regularized empirical risk minimizers

Authors

Geoffrey Chinot, Matthias Löffler, Sara van de Geer

Abstract


This article develops a general theory for minimum norm interpolating estimators and regularized empirical risk minimizers (RERM) in linear models in the presence of additive, potentially adversarial, errors. In particular, no conditions on the errors are imposed. A quantitative bound for the prediction error is given, relating it to the Rademacher complexity of the covariates, the norm of the minimum norm interpolator of the errors and the size of the subdifferential around the true parameter. The general theory is illustrated for Gaussian features and several norms: The $\ell_1$, $\ell_2$, group Lasso and nuclear norms. In case of sparsity or low-rank inducing norms, minimum norm interpolators and RERM yield a prediction error of the order of the average noise level, provided that the overparameterization is at least a logarithmic factor larger than the number of samples and that, in case of RERM, the regularization parameter is small enough. Lower bounds that show near optimality of the results complement the analysis.
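To make the central object concrete, the sketch below computes a minimum $\ell_2$-norm interpolator in an overparameterized linear model with Gaussian features, the setting the abstract illustrates. This is a minimal illustration, not the paper's method: the dimensions, sparsity level, and noise scale are arbitrary choices, and the pseudoinverse formula $\hat\beta = X^\top (XX^\top)^{-1} y$ is the standard closed form for the minimum $\ell_2$-norm solution when $X$ has full row rank.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 500                       # overparameterized: p >> n
X = rng.standard_normal((n, p))      # Gaussian features, as in the paper's examples
beta_true = np.zeros(p)
beta_true[:5] = 1.0                  # sparse ground truth (illustrative choice)
eps = 0.1 * rng.standard_normal(n)   # additive errors; no conditions are needed on them
y = X @ beta_true + eps

# Minimum l2-norm interpolator: among all beta with X beta = y,
# the one of smallest Euclidean norm is pinv(X) @ y.
beta_hat = np.linalg.pinv(X) @ y

# It interpolates the noisy responses exactly.
assert np.allclose(X @ beta_hat, y)
```

Any other interpolator differs from `beta_hat` by a null-space direction of `X`, which is orthogonal to `beta_hat`, so its norm can only be larger; the paper's bounds relate the prediction error of such interpolators to the Rademacher complexity of the covariates and the norm of the minimum norm interpolator of the errors.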
