Paper Title

Learning Interface Conditions in Domain Decomposition Solvers

Authors

Ali Taghibakhshi, Nicolas Nytko, Tareq Zaman, Scott MacLachlan, Luke Olson, Matthew West

Abstract

Domain decomposition methods are widely used and effective in the approximation of solutions to partial differential equations. Yet the optimal construction of these methods requires tedious analysis and is often available only in simplified, structured-grid settings, limiting their use for more complex problems. In this work, we generalize optimized Schwarz domain decomposition methods to unstructured-grid problems, using Graph Convolutional Neural Networks (GCNNs) and unsupervised learning to learn optimal modifications at subdomain interfaces. A key ingredient in our approach is an improved loss function, enabling effective training on relatively small problems, but robust performance on arbitrarily large problems, with computational cost linear in problem size. The performance of the learned linear solvers is compared with both classical and optimized domain decomposition algorithms, for both structured- and unstructured-grid problems.
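As a rough illustration of the classical domain decomposition setting the paper builds on (not the paper's learned GCNN method), the sketch below runs a plain alternating Schwarz iteration on a 1D Poisson model problem in NumPy. The grid size, overlap width, and names such as `dom1`/`dom2` are arbitrary choices for this example; optimized Schwarz methods, and the learned variants the paper studies, replace the Dirichlet data exchanged at the subdomain interfaces with modified interface conditions to accelerate convergence.

```python
import numpy as np

# Minimal sketch (not the paper's method): classical alternating Schwarz
# for the 1D Poisson problem -u'' = f on (0, 1), u(0) = u(1) = 0,
# discretized with the standard 3-point stencil. Two overlapping
# subdomains exchange Dirichlet data at their interfaces each sweep.

n = 99                      # number of interior grid points (example value)
h = 1.0 / (n + 1)
f = np.ones(n)              # right-hand side f(x) = 1

# Global discrete operator A: tridiagonal (1/h^2) * [-1, 2, -1]
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

# Two overlapping subdomains (index sets)
overlap = 10
mid = n // 2
dom1 = np.arange(0, mid + overlap)
dom2 = np.arange(mid - overlap, n)

u = np.zeros(n)             # initial guess
u_exact = np.linalg.solve(A, f)

for sweep in range(20):
    for dom in (dom1, dom2):
        # Solve the local problem; current values of u outside the
        # subdomain act as Dirichlet data and move to the right-hand side.
        out = np.setdiff1d(np.arange(n), dom)
        rhs = f[dom] - A[np.ix_(dom, out)] @ u[out]
        u[dom] = np.linalg.solve(A[np.ix_(dom, dom)], rhs)
    err = np.linalg.norm(u - u_exact) / np.linalg.norm(u_exact)
    print(f"sweep {sweep + 1:2d}: relative error {err:.3e}")
```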
