Title
Shallow neural network representation of polynomials
Authors
Abstract
We show that $d$-variate polynomials of degree $R$ can be represented on $[0,1]^d$ as shallow neural networks of width $2(R+d)^d$. Moreover, via the shallow neural network (SNN) representation of localized Taylor polynomials of univariate $C^\beta$-smooth functions, we derive for shallow networks the minimax-optimal rate of convergence, up to a logarithmic factor, to an unknown univariate regression function.
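As a quick illustration (not part of the paper), the sketch below tabulates the stated width bound $2(R+d)^d$ against the number of monomials of degree at most $R$ in $d$ variables, $\binom{R+d}{d}$, for a few small $(d, R)$; the function names are hypothetical.

```python
from math import comb

def width_bound(d, R):
    # Width of the shallow network stated in the abstract: 2*(R+d)^d.
    return 2 * (R + d) ** d

def num_monomials(d, R):
    # Dimension of the space of d-variate polynomials of degree <= R.
    return comb(R + d, d)

for d, R in [(1, 3), (2, 3), (3, 2)]:
    print(f"d={d}, R={R}: width bound {width_bound(d, R)}, "
          f"monomials {num_monomials(d, R)}")
```

For instance, for $d=2$, $R=3$ the bound gives width $2\cdot 5^2 = 50$, while the polynomial space itself has only $\binom{5}{2} = 10$ monomials; the bound is polynomial in $R$ for fixed $d$ but grows like $(R+d)^d$ in the dimension.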