Paper Title

Extending the Universal Approximation Theorem for a Broad Class of Hypercomplex-Valued Neural Networks

Paper Authors

Wington L. Vital, Guilherme Vieira, Marcos Eduardo Valle

Paper Abstract

The universal approximation theorem asserts that a single hidden layer neural network approximates continuous functions with any desired precision on compact sets. As an existential result, the universal approximation theorem supports the use of neural networks in various applications, including regression and classification tasks. The universal approximation theorem is not limited to real-valued neural networks but also holds for complex-, quaternion-, tessarine-, and Clifford-valued neural networks. This paper extends the universal approximation theorem to a broad class of hypercomplex-valued neural networks. Precisely, we first introduce the concept of a non-degenerate hypercomplex algebra. Complex numbers, quaternions, and tessarines are examples of non-degenerate hypercomplex algebras. Then, we state the universal approximation theorem for hypercomplex-valued neural networks defined on a non-degenerate algebra.
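For context, the classical real-valued universal approximation theorem that the paper generalizes can be stated as follows: for every continuous function $f: K \to \mathbb{R}$ on a compact set $K \subset \mathbb{R}^n$, a suitable activation $\sigma$ (e.g., a sigmoidal or, more generally, non-polynomial function), and every $\varepsilon > 0$, there exist a width $N$ and parameters $\alpha_i, b_i \in \mathbb{R}$, $w_i \in \mathbb{R}^n$ such that

$$\sup_{x \in K} \left| f(x) - \sum_{i=1}^{N} \alpha_i\, \sigma\!\left(w_i^{\mathsf{T}} x + b_i\right) \right| < \varepsilon.$$

The abstract does not describe the network architecture in detail, but as an illustration only, here is a minimal NumPy sketch of a single-hidden-layer quaternion-valued network with a split (component-wise) sigmoid activation, a common choice in the hypercomplex-valued network literature; the names `qmult`, `split_sigmoid`, and `qmlp_forward` are hypothetical, not from the paper.

```python
import numpy as np

def qmult(p, q):
    """Hamilton product of quaternion arrays with shape (..., 4) = (w, x, y, z).
    Illustrative helper, not from the paper."""
    pw, px, py, pz = p[..., 0], p[..., 1], p[..., 2], p[..., 3]
    qw, qx, qy, qz = q[..., 0], q[..., 1], q[..., 2], q[..., 3]
    return np.stack([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ], axis=-1)

def split_sigmoid(q):
    """Split activation: the real sigmoid applied to each quaternion component."""
    return 1.0 / (1.0 + np.exp(-q))

def qmlp_forward(x, W1, b1, W2, b2):
    """Single-hidden-layer quaternion-valued network (assumed architecture).
    x: (n_in, 4); W1: (n_hidden, n_in, 4); b1: (n_hidden, 4);
    W2: (n_out, n_hidden, 4); b2: (n_out, 4)."""
    # Hidden layer: quaternion-weighted sums followed by the split activation.
    h = split_sigmoid(qmult(W1, x[None, :, :]).sum(axis=1) + b1)
    # Output layer: affine quaternion combination of the hidden activations.
    return qmult(W2, h[None, :, :]).sum(axis=1) + b2

# Example with random parameters: three quaternion inputs, one quaternion output.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 3, 8, 1
W1 = rng.standard_normal((n_hidden, n_in, 4))
b1 = rng.standard_normal((n_hidden, 4))
W2 = rng.standard_normal((n_out, n_hidden, 4))
b2 = rng.standard_normal((n_out, 4))
print(qmlp_forward(rng.standard_normal((n_in, 4)), W1, b1, W2, b2))
```

Note that the theorem, like its real-valued counterpart, is existential: it guarantees that some choice of `W1, b1, W2, b2` with a large enough `n_hidden` approximates a given continuous function on a compact set, but it does not say how to find those parameters.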
