Title
Size and depth of monotone neural networks: interpolation and approximation
Authors
Abstract
We study monotone neural networks with threshold gates where all the weights (other than the biases) are non-negative. We focus on the expressive power and efficiency of representation of such networks. Our first result establishes that every monotone function over $[0,1]^d$ can be approximated within arbitrarily small additive error by a depth-4 monotone network. When $d > 3$, this improves upon the previous best-known construction, which has depth $d+1$. Our proof proceeds by solving the monotone interpolation problem for monotone datasets using a depth-4 monotone threshold network. In our second main result we compare size bounds between monotone and arbitrary neural networks with threshold gates. We find that there are monotone real functions that can be computed efficiently by networks with no restriction on the gates, whereas any monotone network approximating these functions requires size exponential in the dimension.
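To make the model concrete, below is a minimal Python sketch of a monotone threshold network in the sense of the abstract: each gate outputs $\mathbb{1}[\langle w, x\rangle \ge b]$, every weight is non-negative, and only the biases are unrestricted. The specific architecture, weights, and function names are illustrative assumptions, not the paper's depth-4 construction.

```python
import numpy as np

def threshold_gate(x, w, b):
    """Threshold gate: outputs 1 if <w, x> >= b, else 0.
    In a monotone network the weights w are non-negative;
    the bias b may be arbitrary (matching the abstract)."""
    return float(np.dot(w, x) >= b)

# A tiny depth-2 monotone threshold network on [0,1]^2.
# Every weight is >= 0, so the computed function is monotone:
# increasing any input coordinate can only raise the output.
def monotone_network(x):
    h1 = threshold_gate(x, w=np.array([1.0, 0.5]), b=0.7)
    h2 = threshold_gate(x, w=np.array([0.2, 1.0]), b=0.9)
    return threshold_gate(np.array([h1, h2]), w=np.array([1.0, 1.0]), b=1.0)

# Sanity check of monotonicity on a coordinate-wise larger input:
assert monotone_network(np.array([0.9, 0.9])) >= monotone_network(np.array([0.1, 0.1]))
```

The monotonicity follows because each gate is a non-decreasing function of its inputs and compositions of non-decreasing functions are non-decreasing; the paper's results concern how small and shallow such networks can be while interpolating or approximating a target monotone function.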