Paper Title


Facing Changes: Continual Entity Alignment for Growing Knowledge Graphs

Paper Authors

Yuxin Wang, Yuanning Cui, Wenqiang Liu, Zequn Sun, Yiqiao Jiang, Kexin Han, Wei Hu

Paper Abstract


Entity alignment is a basic and vital technique in knowledge graph (KG) integration. Over the years, research on entity alignment has resided on the assumption that KGs are static, which neglects the nature of growth of real-world KGs. As KGs grow, previous alignment results face the need to be revisited while new entity alignment waits to be discovered. In this paper, we propose and dive into a realistic yet unexplored setting, referred to as continual entity alignment. To avoid retraining an entire model on the whole KGs whenever new entities and triples come, we present a continual alignment method for this task. It reconstructs an entity's representation based on entity adjacency, enabling it to generate embeddings for new entities quickly and inductively using their existing neighbors. It selects and replays partial pre-aligned entity pairs to train only parts of KGs while extracting trustworthy alignment for knowledge augmentation. As growing KGs inevitably contain non-matchable entities, different from previous works, the proposed method employs bidirectional nearest neighbor matching to find new entity alignment and update old alignment. Furthermore, we also construct new datasets by simulating the growth of multilingual DBpedia. Extensive experiments demonstrate that our continual alignment method is more effective than baselines based on retraining or inductive learning.
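The bidirectional nearest neighbor matching mentioned above can be sketched as keeping only mutually nearest embedding pairs, so that entities without a mutual match are treated as non-matchable. This is a minimal illustration under assumed cosine similarity, not the paper's actual implementation; the function name and interface are hypothetical:

```python
import numpy as np

def mutual_nearest_matches(src_emb: np.ndarray, tgt_emb: np.ndarray):
    """Return (source_index, target_index) pairs that are mutual nearest
    neighbors under cosine similarity. A sketch, not the paper's method."""
    # L2-normalize so that the dot product equals cosine similarity.
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    sim = src @ tgt.T

    # Nearest target for each source entity, and nearest source for each target.
    s2t = sim.argmax(axis=1)
    t2s = sim.argmax(axis=0)

    # Keep only mutual nearest neighbors; a source whose nearest target
    # prefers a different source is left unaligned (non-matchable).
    return [(i, int(j)) for i, j in enumerate(s2t) if t2s[j] == i]
```

For example, with three source embeddings and two target embeddings, a source entity whose nearest target already has a closer source partner receives no alignment, which is the behavior a growing KG with non-matchable entities requires.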
