Title
Sampling numbers of smoothness classes via $\ell^1$-minimization
Authors
Abstract
Using techniques developed recently in the field of compressed sensing, we prove new upper bounds for general (nonlinear) sampling numbers of (quasi-)Banach smoothness spaces in $L^2$. In particular, we show that in relevant cases such as mixed and isotropic weighted Wiener classes or Sobolev spaces with mixed smoothness, sampling numbers in $L^2$ can be upper bounded by best $n$-term trigonometric widths in $L^\infty$. We describe a recovery procedure from $m$ function values based on $\ell^1$-minimization (basis pursuit denoising). With this method, a significant gain in the rate of convergence compared to recently developed linear recovery methods is achieved. In this deterministic worst-case setting we see an additional speed-up of $m^{-1/2}$ (up to log factors) compared to linear methods in the case of weighted Wiener spaces. For their quasi-Banach counterparts even arbitrary polynomial speed-up is possible. Surprisingly, our approach allows us to recover mixed smoothness Sobolev functions belonging to $S^r_pW(\mathbb{T}^d)$ on the $d$-torus with a logarithmically better rate of convergence than any linear method can achieve when $1 < p < 2$ and $d$ is large. This effect is not present for isotropic Sobolev spaces.