Paper title
Quantum activation functions for quantum neural networks
Paper authors
Abstract
The field of artificial neural networks is expected to benefit strongly from recent developments in quantum computers. In particular, quantum machine learning, a class of quantum algorithms that exploit qubits to create trainable neural networks, will provide more power to solve problems such as pattern recognition, clustering, and machine learning in general. The building block of feed-forward neural networks consists of one layer of neurons connected to an output neuron that is activated according to an arbitrary activation function. The corresponding learning algorithm goes under the name of the Rosenblatt perceptron. Quantum perceptrons with specific activation functions are known, but a general method to realize arbitrary activation functions on a quantum computer is still lacking. Here we fill this gap with a quantum algorithm capable of approximating any analytic activation function to any given order of its power series. Unlike previous proposals, which provide irreversible, measurement-based, and simplified activation functions, here we show how to approximate any analytic function to any required accuracy without measuring the states that encode the information. Thanks to the generality of this construction, any feed-forward neural network may acquire the universal approximation property in the sense of Hornik's theorem. Our results recast the science of artificial neural networks in the architecture of gate-model quantum computers.
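The abstract's central idea is that an analytic activation function can be approximated by truncating its power series. As a purely classical illustration (not the paper's quantum algorithm), the sketch below evaluates the logistic sigmoid via its low-order Maclaurin expansion, sigma(x) = 1/2 + x/4 - x^3/48 + x^5/480 - ..., and compares it with the exact value; the function names are illustrative, not taken from the paper.

```python
import math

# Exact low-order Maclaurin coefficients of the logistic sigmoid:
# sigma(x) = 1/2 + x/4 - x^3/48 + x^5/480 - ...
SIGMOID_COEFFS = [0.5, 0.25, 0.0, -1.0 / 48, 0.0, 1.0 / 480]

def sigmoid_series(x, coeffs=SIGMOID_COEFFS):
    """Evaluate the truncated power series sum_k c_k * x^k."""
    return sum(c * x**k for k, c in enumerate(coeffs))

def sigmoid(x):
    """Exact logistic sigmoid for comparison."""
    return 1.0 / (1.0 + math.exp(-x))

# Near the expansion point the truncation error is already tiny.
x = 0.5
print(abs(sigmoid_series(x) - sigmoid(x)))  # small truncation error
```

Adding higher-order terms shrinks the error further, which mirrors the abstract's claim that the approximation can be pushed to any required accuracy by taking the series to a higher order.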