Paper Title
Coin Flipping Neural Networks
Paper Authors
Paper Abstract
We show that neural networks with access to randomness can outperform deterministic networks by using amplification. We call such networks Coin-Flipping Neural Networks, or CFNNs. We show that a CFNN can approximate the indicator of a $d$-dimensional ball to arbitrary accuracy with only 2 layers and $\mathcal{O}(1)$ neurons, whereas a 2-layer deterministic network was shown to require $\Omega(e^d)$ neurons, an exponential improvement (arXiv:1610.09887). We prove a highly non-trivial result: for almost any classification problem, there exists a trivially simple network that solves it, given a sufficiently powerful generator for the network's weights. Combining these results, we conjecture that for most classification problems, there is a CFNN which solves them with higher accuracy or fewer neurons than any deterministic network. Finally, we verify our proofs experimentally using novel CFNN architectures on CIFAR10 and CIFAR100, reaching an improvement of 9.25\% over the baseline.
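The abstract does not spell out the construction, but one plausible mechanism behind the ball-indicator claim can be sketched as follows (this is an illustrative toy, not the paper's actual architecture): a single stochastic linear-threshold neuron with Gaussian weights $w \sim \mathcal{N}(0, I_d)$ fires iff $w \cdot x > t$. Since $w \cdot x \sim \mathcal{N}(0, \|x\|^2)$, the firing probability $1 - \Phi(t/\|x\|)$ is monotone in $\|x\|$, so repeating the coin flip (amplification) and comparing the empirical firing rate against the rate at the boundary $\|x\| = 1$ recovers the indicator of the unit ball with a constant number of neurons, independent of $d$. The function and variable names below are our own.

```python
import numpy as np
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def cfnn_ball_indicator(x, n_flips=2000, t=1.0, seed=0):
    """Toy 'coin-flipping' classifier for the unit ball (a sketch,
    not the paper's construction).

    One stochastic neuron with weights w ~ N(0, I_d) fires iff
    w.x > t; its firing probability 1 - Phi(t / ||x||) grows with
    ||x||.  Amplification: estimate that probability from n_flips
    independent coin flips and threshold at the boundary rate.
    Returns 1 if x is judged inside the unit ball, else 0.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((n_flips, len(x)))   # n_flips coin flips
    p_hat = float(((w @ x) > t).mean())          # empirical firing rate
    p_boundary = 1.0 - Phi(t)                    # firing rate at ||x|| = 1
    return 1 if p_hat < p_boundary else 0

# Points of norm 0.5 (inside) and 2.0 (outside) in d = 50 dimensions:
d = 50
rng = np.random.default_rng(1)
inside = rng.standard_normal(d)
inside *= 0.5 / np.linalg.norm(inside)
outside = rng.standard_normal(d)
outside *= 2.0 / np.linalg.norm(outside)
print(cfnn_ball_indicator(inside), cfnn_ball_indicator(outside))
```

Note that the number of *neurons* stays constant as $d$ grows; only the number of coin flips controls the accuracy of the probability estimate, which is the amplification trade-off the abstract alludes to.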