Paper Title
STaR: Knowledge Graph Embedding by Scaling, Translation and Rotation
Paper Authors
Paper Abstract
Bilinear methods are mainstream in Knowledge Graph Embedding (KGE), aiming to learn low-dimensional representations for the entities and relations in a Knowledge Graph (KG) and to complete missing links. Most existing works find patterns among relations and model them effectively to accomplish this task. Previous works have mainly identified six important patterns, such as non-commutativity. Although some bilinear methods succeed in modeling these patterns, they neglect to handle 1-to-N, N-to-1, and N-to-N relations (i.e., complex relations) concurrently, which hurts their expressiveness. To this end, we integrate scaling with the combination of translation and rotation, which address complex relations and relation patterns, respectively, where scaling is a simplification of projection. We therefore propose a corresponding bilinear model, Scaling Translation and Rotation (STaR), consisting of these two parts. Besides, since translation cannot be incorporated into a bilinear model directly, we introduce a translation matrix as its equivalent. Theoretical analysis proves that STaR is capable of modeling all six patterns and handling complex relations simultaneously, and experiments demonstrate its effectiveness on commonly used benchmarks for link prediction.
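To make the idea concrete, the following is a minimal sketch of a STaR-style triple-scoring function, not the authors' exact formulation: entity embeddings are viewed as stacks of 2-D blocks, the relation applies a per-block scaled rotation plus a translation to the head, and the score is the inner product with the tail. All parameter names and the block-wise parameterization are illustrative assumptions.

```python
import numpy as np

def star_score(h, t, scale, theta, trans):
    """Illustrative bilinear-style score combining scaling, rotation,
    and translation (hypothetical parameterization, for exposition only).

    h, t  : entity embeddings, shape (2k,), viewed as k 2-D blocks
    scale : per-block scaling factors, shape (k,)
    theta : per-block rotation angles, shape (k,)
    trans : translation vector, shape (2k,)
    """
    k = scale.shape[0]
    hb = h.reshape(k, 2)                      # split head into 2-D blocks
    cos, sin = np.cos(theta), np.sin(theta)
    # apply a 2-D rotation to each block of the head entity
    rot = np.stack([cos * hb[:, 0] - sin * hb[:, 1],
                    sin * hb[:, 0] + cos * hb[:, 1]], axis=1)
    # scale each rotated block, then translate
    transformed = (scale[:, None] * rot).reshape(-1) + trans
    # bilinear-style score: inner product with the tail embedding
    return float(np.dot(transformed, t))
```

With unit scaling, zero angles, and zero translation the relation acts as the identity, so the score reduces to the plain inner product of head and tail; nonzero angles and translations then let the model express rotation- and translation-based patterns on top of the scaling that handles complex relations.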