Title

A Dimension-Augmented Physics-Informed Neural Network (DaPINN) with High Level Accuracy and Efficiency

Authors

Weilong Guan, Kaihan Yang, Yinsheng Chen, Zhong Guan

Abstract

Physics-informed neural networks (PINNs) have been widely applied in different fields due to their effectiveness in solving partial differential equations (PDEs). However, the accuracy and efficiency of PINNs need to be considerably improved for scientific and commercial use. To address this issue, we systematically propose a novel dimension-augmented physics-informed neural network (DaPINN), which simultaneously and significantly improves the accuracy and efficiency of the PINN. In the DaPINN model, we introduce inductive bias in the neural network to enhance network generalizability by adding a special regularization term to the loss function. Furthermore, we manipulate the network input dimension by inserting additional sample features and incorporating the expanded dimensionality in the loss function. Moreover, we verify the effectiveness of power series augmentation, Fourier series augmentation and replica augmentation, in both forward and backward problems. In most experiments, the error of DaPINN is 1$\sim$2 orders of magnitude lower than that of PINN. The results show that the DaPINN outperforms the original PINN in terms of both accuracy and efficiency with a reduced dependence on the number of sample points. We also discuss the complexity of the DaPINN and its compatibility with other methods.
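The abstract's central idea is to augment the network input with extra features derived from the original coordinates (e.g., power-series or Fourier-series terms) before feeding them to an otherwise standard PINN. Below is a minimal PyTorch sketch of that idea under our own assumptions: the function names, network width, and choice of basis are illustrative and not taken from the paper, and the paper's replica augmentation and extra regularization term are omitted.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: names and sizes here are assumptions,
# not the paper's reference implementation.

def power_augment(x, order=2):
    # Append power-series features x, x^2, ..., x^order as extra inputs.
    return torch.cat([x**k for k in range(1, order + 1)], dim=-1)

def fourier_augment(x, num_freqs=2):
    # Append Fourier-series features sin(k*pi*x), cos(k*pi*x) as extra inputs.
    feats = [x]
    for k in range(1, num_freqs + 1):
        feats += [torch.sin(k * torch.pi * x), torch.cos(k * torch.pi * x)]
    return torch.cat(feats, dim=-1)

class AugmentedPINN(nn.Module):
    def __init__(self, in_dim, hidden=32, augment=power_augment):
        super().__init__()
        self.augment = augment
        aug_dim = augment(torch.zeros(1, in_dim)).shape[-1]
        self.net = nn.Sequential(
            nn.Linear(aug_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        # The PDE residual loss is still built from derivatives of the output
        # w.r.t. the original coordinates x (via autograd), as in a plain PINN.
        return self.net(self.augment(x))

# Example usage: a 2D input such as (x, t).
model = AugmentedPINN(in_dim=2)
u = model(torch.rand(128, 2, requires_grad=True))
```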
