Title
A Theory on AI Uncertainty Based on Rademacher Complexity and Shannon Entropy
Authors
Abstract
In this paper, we present a theoretical discussion of uncertainty in AI deep-learning neural networks based on the classical Rademacher complexity and the Shannon entropy. First, it is shown that the classical Rademacher complexity and the Shannon entropy are, by their definitions, closely related quantities. Second, based on Shannon's mathematical theory of communication [3], we derive a criterion that ensures AI correctness and accuracy in classification problems. Last but not least, building on Peter Bartlett's work, we give both a relaxed condition and a stricter condition that guarantee correctness and accuracy in AI classification. By expressing the criterion in terms of Shannon entropy on the basis of Shannon's theory, it becomes easier to explore analogous criteria in terms of other complexity measures, such as the Vapnik-Chervonenkis dimension and the Gaussian complexity, by taking advantage of the relations among these measures studied in other references. A close-to-0.5 criterion on the Shannon entropy is derived in this paper for the theoretical investigation of AI accuracy and correctness in classification problems.
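For reference, the two quantities the abstract relates can be sketched from their textbook definitions: the binary Shannon entropy of a Bernoulli source and a Monte Carlo estimate of the empirical Rademacher complexity of a finite function class. This is an illustrative sketch only; the function names and the toy two-function class are our own assumptions, not constructions from the paper.

```python
import math
import random

def binary_entropy(p):
    """Shannon entropy (in bits) of a Bernoulli(p) source:
    H(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def empirical_rademacher(function_values, n_trials=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_hat = E_sigma[ sup_f (1/n) * sum_i sigma_i * f(x_i) ],
    where each sigma_i is an independent uniform +/-1 variable and the
    (finite, toy) class is given as a list of value vectors f(x_1..x_n)."""
    rng = random.Random(seed)
    n = len(function_values[0])
    total = 0.0
    for _ in range(n_trials):
        sigma = [rng.choice((-1, 1)) for _ in range(n)]
        total += max(sum(s * v for s, v in zip(sigma, f)) / n
                     for f in function_values)
    return total / n_trials

# Toy example: the two constant classifiers {+1, -1} on 100 points.
fclass = [[1] * 100, [-1] * 100]
print(binary_entropy(0.5))          # maximal binary entropy: 1.0 bit
print(empirical_rademacher(fclass))  # small for this trivial class
```

Note that `binary_entropy(0.5)` attains the maximum of 1 bit, the value around which the paper's close-to-0.5 criterion on the success probability is centered; the Rademacher estimate for the trivial two-function class shrinks toward 0 as the sample size grows.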