Paper Title
Spectral Graph Attention Network with Fast Eigen-approximation
Paper Authors
Paper Abstract
Variants of Graph Neural Networks (GNNs) for representation learning have been proposed recently and have achieved fruitful results in various fields. Among them, the Graph Attention Network (GAT) first employed a self-attention strategy to learn attention weights for each edge in the spatial domain. However, learning attention over edges captures only local information about the graph and greatly increases the computational cost. In this paper, we first introduce the attention mechanism in the spectral domain of graphs and present the Spectral Graph Attention Network (SpGAT), which learns representations for different frequency components using weighted filters and graph wavelet bases. In this way, SpGAT can better capture global patterns of graphs in an efficient manner, with far fewer learned parameters than GAT. Further, to reduce the computational cost that the eigen-decomposition brings to SpGAT, we propose a fast approximation variant, SpGAT-Cheby. We thoroughly evaluate the performance of SpGAT and SpGAT-Cheby on semi-supervised node classification tasks and verify the effectiveness of the attention learned in the spectral domain.
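The abstract only outlines the architecture, but its core idea, re-weighting low- and high-frequency spectral components with learned attention scores, can be illustrated with a short sketch. The snippet below is a minimal NumPy illustration and not the authors' implementation: it substitutes the Laplacian's Fourier eigenbasis for the paper's graph wavelet bases, and the function names (`spgat_layer`, `normalized_laplacian`), the fixed half-spectrum split, and the two-way softmax attention are all illustrative assumptions.

```python
import numpy as np

def normalized_laplacian(adj):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    return np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

def spgat_layer(adj, features, w_low, w_high, alpha):
    """Illustrative spectral attention layer (Fourier basis as a stand-in
    for the paper's graph wavelets): project features onto low- and
    high-frequency components, re-weight them with two learned attention
    scores, and apply separate linear transforms."""
    lap = normalized_laplacian(adj)
    _, eigvec = np.linalg.eigh(lap)            # full eigen-decomposition, O(n^3)
    k = adj.shape[0] // 2                      # assumed frequency split point
    u_low, u_high = eigvec[:, :k], eigvec[:, k:]
    att = np.exp(alpha) / np.exp(alpha).sum()  # softmax over the two scores
    h_low = u_low @ (u_low.T @ features) * att[0]
    h_high = u_high @ (u_high.T @ features) * att[1]
    return h_low @ w_low + h_high @ w_high

# Toy usage on a 4-node path graph.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = rng.standard_normal((4, 3))
out = spgat_layer(adj, x,
                  w_low=rng.standard_normal((3, 2)),
                  w_high=rng.standard_normal((3, 2)),
                  alpha=np.array([0.0, 0.0]))  # equal attention to start
```

Note that the attention here is only a pair of scalars per layer (one per frequency band) rather than one weight per edge, which is consistent with the abstract's claim of far fewer learned parameters than GAT.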
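The abstract also mentions that SpGAT-Cheby avoids the eigen-decomposition. A standard way to do this, and the one suggested by the "Cheby" name, is a truncated Chebyshev expansion of the spectral filter. The sketch below assumes a heat-kernel wavelet filter exp(-scale * L); the function name `cheby_wavelet_filter` and the interpolation-based coefficient fit are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def cheby_wavelet_filter(lap, x, scale, order=10):
    """Approximate the heat-kernel wavelet filter exp(-scale * L) @ x with
    a truncated Chebyshev expansion, avoiding eigen-decomposition."""
    n = lap.shape[0]
    lap_shifted = lap - np.eye(n)  # map Laplacian spectrum [0, 2] -> [-1, 1]
    # Fit Chebyshev coefficients of f(t) = exp(-scale * (t + 1)) on [-1, 1]
    # by interpolation at order+1 Chebyshev nodes.
    nodes = np.cos(np.pi * (np.arange(order + 1) + 0.5) / (order + 1))
    coeffs = np.polynomial.chebyshev.chebfit(
        nodes, np.exp(-scale * (nodes + 1.0)), order)
    # Three-term recurrence T_k = 2 L' T_{k-1} - T_{k-2}, applied to x.
    t_prev, t_curr = x, lap_shifted @ x
    out = coeffs[0] * t_prev + coeffs[1] * t_curr
    for c in coeffs[2:]:
        t_prev, t_curr = t_curr, 2.0 * (lap_shifted @ t_curr) - t_prev
        out = out + c * t_curr
    return out
```

Replacing the exact wavelet transform with this recurrence trades a one-off O(n^3) factorization for `order` matrix-vector products, which is what makes such approximations attractive on large sparse graphs.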