Abstract
Domain decomposition methods (DDMs) are popular solvers for discretized systems of partial differential equations (PDEs), with one-level and multilevel variants. These solvers rely on several algorithmic and mathematical parameters, prescribing overlap, subdomain boundary conditions, and other properties of the DDM. While some work has been done on optimizing these parameters, it has mostly focused on the one-level setting or special cases such as structured-grid discretizations with regular subdomain construction. In this paper, we propose multigrid graph neural networks (MG-GNN), a novel GNN architecture for learning optimized parameters in two-level DDMs. We train MG-GNN using a new unsupervised loss function, enabling effective training on small problems that yields robust performance on unstructured grids that are orders of magnitude larger than those in the training set. We show that MG-GNN outperforms popular hierarchical graph network architectures for this optimization and that our proposed loss function is critical to achieving this improved performance.
URL
https://arxiv.org/abs/2301.11378