Paper Title
EDEN: A Plug-in Equivariant Distance Encoding to Beyond the 1-WL Test
Paper Authors
Paper Abstract
The message-passing scheme is the core of graph representation learning. While most existing message-passing graph neural networks (MPNNs) are permutation-invariant in graph-level representation learning and permutation-equivariant in node- and edge-level representation learning, their expressive power is commonly limited by the 1-Weisfeiler-Lehman (1-WL) graph isomorphism test. Recently proposed expressive graph neural networks (GNNs) with specially designed, complex message-passing mechanisms are not practical. To bridge the gap, we propose a plug-in Equivariant Distance ENcoding (EDEN) for MPNNs. EDEN is derived from a series of interpretable transformations on the graph's distance matrix. We theoretically prove that EDEN is permutation-equivariant for graph representation learning at all levels (graph, node, and edge), and we empirically illustrate that EDEN's expressive power can reach up to the 3-WL test. Extensive experiments on real-world datasets show that combining EDEN with conventional GNNs surpasses recent advanced GNNs.
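To make the idea of a plug-in, permutation-equivariant distance encoding concrete, below is a minimal sketch. It is not the authors' exact series of transformations (the abstract does not specify them); it only illustrates one simple node-level encoding derived from the graph's shortest-path distance matrix that is permutation-equivariant by construction and can be concatenated to node features before a standard MPNN. The function name `distance_encoding` and the parameter `max_dist` are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's exact EDEN transformations):
# a permutation-equivariant distance encoding from the shortest-path
# distance matrix, intended to be concatenated to node features
# before running a conventional MPNN.
import numpy as np
import networkx as nx


def distance_encoding(G: nx.Graph, max_dist: int = 5) -> np.ndarray:
    """Per-node histogram of shortest-path distances, capped at max_dist.

    Permuting the node ordering permutes the rows of the output in the
    same way, so this encoding is permutation-equivariant at node level.
    """
    nodes = list(G.nodes())
    n = len(nodes)
    # Dense distance matrix; unreachable pairs are mapped to max_dist + 1.
    D = np.full((n, n), max_dist + 1, dtype=int)
    sp = dict(nx.all_pairs_shortest_path_length(G))
    for i, u in enumerate(nodes):
        for j, v in enumerate(nodes):
            if v in sp[u]:
                D[i, j] = min(sp[u][v], max_dist + 1)
    # Row-wise histogram over distance values 0 .. max_dist + 1.
    enc = np.zeros((n, max_dist + 2), dtype=float)
    for i in range(n):
        for d in D[i]:
            enc[i, d] += 1.0
    return enc / max(n - 1, 1)  # normalize by number of other nodes


# Usage: augment node features X with the encoding, then feed X to an MPNN.
G = nx.cycle_graph(6)
enc = distance_encoding(G)
print(enc.shape)  # (6, 7)
```

A row-wise histogram is used here only because it is obviously equivariant and cheap to compute; richer distance-matrix transformations (as the paper proposes) are what push the expressive power beyond 1-WL.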