Paper Title

Graph-adaptive Rectified Linear Unit for Graph Neural Networks

Paper Authors

Yifei Zhang, Hao Zhu, Ziqiao Meng, Piotr Koniusz, Irwin King

Paper Abstract

Graph Neural Networks (GNNs) have achieved remarkable success by extending traditional convolution to learning on non-Euclidean data. The key to GNNs is the neural message-passing paradigm, which has two stages: aggregation and update. Current GNN designs consider topology information in the aggregation stage; in the update stage, however, all nodes share the same updating function. An identical updating function treats node embeddings as i.i.d. random variables and thus ignores the implicit relationships between neighborhoods, which limits the capacity of GNNs. The updating function is usually implemented as a linear transformation followed by a non-linear activation function. To make the updating function topology-aware, we inject topological information into the non-linear activation function and propose the Graph-adaptive Rectified Linear Unit (GReLU), a new parametric activation function that incorporates neighborhood information in a novel and efficient way. The parameters of GReLU are obtained from a hyperfunction based on both the node features and the corresponding adjacency matrix. To reduce the risk of overfitting and the computational cost, we decompose the hyperfunction into two independent components, one for nodes and one for features. Comprehensive experiments show that our plug-and-play GReLU method is efficient and effective across different GNN backbones and various downstream tasks.
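
To make the decomposition concrete, below is a minimal PyTorch sketch of a graph-adaptive activation in the spirit of the abstract. It is not the authors' implementation: the `node_branch` MLP, the per-channel `feature_coef` parameter, and the sigmoid gating are illustrative assumptions about what a hyperfunction with independent node and feature components could look like.

```python
import torch
import torch.nn as nn


class GReLUSketch(nn.Module):
    """Sketch of a graph-adaptive ReLU (assumed form, not the paper's exact one).

    The negative-side slope is predicted per node from the neighborhood-
    aggregated features (topology-aware) and modulated by a per-feature
    coefficient, mirroring the abstract's decomposition of the hyperfunction
    into independent node and feature components.
    """

    def __init__(self, num_features: int, hidden: int = 16):
        super().__init__()
        # Node component: maps aggregated neighborhood features to one
        # coefficient per node (hypothetical parameterization).
        self.node_branch = nn.Sequential(
            nn.Linear(num_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )
        # Feature component: one learnable coefficient per channel.
        self.feature_coef = nn.Parameter(torch.ones(num_features))

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: [N, F] node features; adj: [N, N] (normalized) adjacency matrix.
        # Topology is injected by computing node coefficients from A @ X.
        agg = adj @ x                                     # [N, F]
        node_coef = torch.sigmoid(self.node_branch(agg))  # [N, 1]
        slope = node_coef * self.feature_coef             # [N, F] by broadcasting
        # Parametric ReLU: identity on the positive side, learned slope otherwise.
        return torch.where(x > 0, x, slope * x)


# Toy usage: 5 nodes with 8 features and a random dense adjacency.
x = torch.randn(5, 8)
adj = torch.softmax(torch.rand(5, 5), dim=-1)  # row-normalized for stability
act = GReLUSketch(num_features=8)
out = act(x, adj)  # [5, 8]; the activation now depends on each node's neighborhood
```

Because the node and feature components are predicted independently and combined by broadcasting, the extra parameter count grows with the feature dimension and the hidden width rather than with nodes times features, which is the kind of saving the decomposition in the abstract aims at.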
