Paper Title

Generalised Latent Assimilation in Heterogeneous Reduced Spaces with Machine Learning Surrogate Models

Paper Authors

Sibo Cheng, Jianhua Chen, Charitos Anastasiou, Panagiota Angeli, Omar K. Matar, Yi-Ke Guo, Christopher C. Pain, Rossella Arcucci

Paper Abstract

Reduced-order modelling and low-dimensional surrogate models generated using machine learning algorithms have been widely applied in high-dimensional dynamical systems to improve algorithmic efficiency. In this paper, we develop a system which combines reduced-order surrogate models with a novel data assimilation (DA) technique used to incorporate real-time observations from different physical spaces. We make use of local smooth surrogate functions that link the space of encoded system variables to that of the current observations, allowing variational DA to be performed at a low computational cost. The new system, named Generalised Latent Assimilation, can benefit from both the efficiency provided by reduced-order modelling and the accuracy of data assimilation. A theoretical analysis of the difference between the surrogate and the original assimilation cost functions is also provided, with an upper bound that depends on the size of the local training set. The new approach is tested on a high-dimensional CFD application of a two-phase liquid flow with non-linear observation operators that current Latent Assimilation methods cannot handle. Numerical results demonstrate that the proposed assimilation approach can significantly improve the reconstruction and prediction accuracy of the deep learning surrogate model, which is nearly 1000 times faster than the CFD simulation.
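
As a rough illustration of the workflow summarised in the abstract, the sketch below fits a smooth local polynomial surrogate linking encoded (latent) states to observations, and then minimises the variational DA cost function directly in the latent space. It is a minimal sketch under stated assumptions, not the authors' implementation; the function names, the quadratic feature choice, and the L-BFGS optimiser are illustrative.

import numpy as np
from scipy.optimize import minimize

# Minimal sketch (assumption, not the paper's code): a local polynomial
# surrogate stands in for the observation operator composed with the decoder,
# and the variational cost is minimised in the latent space.

def fit_local_surrogate(z_samples, y_samples, degree=2):
    """Least-squares polynomial fit from latent states to observations.

    z_samples : (n, d_latent) latent states sampled around the background state
    y_samples : (n, d_obs) corresponding observations, e.g. H(decoder(z))
    """
    features = np.hstack(
        [z_samples ** k for k in range(1, degree + 1)] + [np.ones((len(z_samples), 1))]
    )
    coeffs, *_ = np.linalg.lstsq(features, y_samples, rcond=None)

    def surrogate(z):
        feats = np.hstack([z ** k for k in range(1, degree + 1)] + [1.0])
        return feats @ coeffs

    return surrogate

def latent_assimilation(z_background, y_obs, surrogate, B_inv, R_inv):
    """Minimise J(z) = (z - z_b)^T B^-1 (z - z_b) + (y - h(z))^T R^-1 (y - h(z)),
    with h the local surrogate replacing the full observation operator."""
    def cost(z):
        dz = z - z_background
        dy = y_obs - surrogate(z)
        return dz @ B_inv @ dz + dy @ R_inv @ dy

    result = minimize(cost, z_background, method="L-BFGS-B")
    return result.x  # analysed (assimilated) latent state

In the paper's setting the surrogate is trained only locally, around the current background latent state, so that the smooth approximation remains accurate; the analysed latent state returned above would then be decoded back to the full physical space by the decoder of the reduced-order model.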
