Paper Title
One-shot Learning for Temporal Knowledge Graphs
Paper Authors
Paper Abstract
Most real-world knowledge graphs are characterized by a long-tail relation frequency distribution in which a significant fraction of relations occurs only a handful of times. This observation has given rise to recent interest in low-shot learning methods that are able to generalize from only a few examples. Existing approaches, however, are tailored to static knowledge graphs and do not easily generalize to temporal settings, where data scarcity poses even bigger problems, e.g., due to the occurrence of new, previously unseen relations. We address this shortcoming by proposing a one-shot learning framework for link prediction in temporal knowledge graphs. Our proposed method employs a self-attention mechanism to effectively encode temporal interactions between entities, and a network to compute a similarity score between a given query and a (one-shot) example. Our experiments show that the proposed algorithm outperforms state-of-the-art baselines on two well-studied benchmarks while achieving significantly better performance for sparse relations.
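To make the two components described in the abstract concrete, the following is a minimal sketch, not the authors' implementation: a self-attention encoder over an entity's time-stamped neighborhood, and a similarity network that scores a query pair against the single (one-shot) support example of a sparse relation. All module names, dimensions, the mean pooling, and the cosine-based matching score are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TemporalNeighborhoodEncoder(nn.Module):
    """Encodes an entity from a sequence of (neighbor, time) embeddings via self-attention."""

    def __init__(self, dim: int = 128, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.proj = nn.Linear(dim, dim)

    def forward(self, neighbor_seq: torch.Tensor) -> torch.Tensor:
        # neighbor_seq: (batch, seq_len, dim); assumed to be neighbor embeddings
        # already combined with a time encoding. Self-attention captures the
        # temporal interactions between the entity's neighbors.
        h, _ = self.attn(neighbor_seq, neighbor_seq, neighbor_seq)
        return self.proj(h.mean(dim=1))  # (batch, dim) entity representation


class SimilarityNetwork(nn.Module):
    """Scores a query pair against the one-shot support pair of the same relation."""

    def __init__(self, dim: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, support_pair: torch.Tensor, query_pair: torch.Tensor) -> torch.Tensor:
        # Each pair is the concatenation of its head and tail entity encodings.
        s, q = self.mlp(support_pair), self.mlp(query_pair)
        return F.cosine_similarity(s, q, dim=-1)  # higher score = more plausible link


# Toy usage: score two candidate query pairs against one support example.
dim = 128
encoder, matcher = TemporalNeighborhoodEncoder(dim), SimilarityNetwork(dim)

def encode_pair(head_seq: torch.Tensor, tail_seq: torch.Tensor) -> torch.Tensor:
    return torch.cat([encoder(head_seq), encoder(tail_seq)], dim=-1)

support = encode_pair(torch.randn(1, 5, dim), torch.randn(1, 5, dim))  # one-shot example
queries = encode_pair(torch.randn(2, 5, dim), torch.randn(2, 5, dim))  # candidate pairs
print(matcher(support, queries))  # similarity scores used to rank candidate links
```

At inference time, candidate tail entities for a query would be ranked by this similarity score against the single support triple of the relation; the exact encoder architecture and scoring function in the paper may differ from this sketch.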