Paper Title


BertNet: Harvesting Knowledge Graphs with Arbitrary Relations from Pretrained Language Models

Paper Authors

Shibo Hao, Bowen Tan, Kaiwen Tang, Bin Ni, Xiyan Shao, Hengzhe Zhang, Eric P. Xing, Zhiting Hu

Abstract


It is crucial to automatically construct knowledge graphs (KGs) of diverse new relations to support knowledge discovery and broad applications. Previous KG construction methods, based on either crowdsourcing or text mining, are often limited to a small predefined set of relations due to manual cost or restrictions in text corpus. Recent research proposed to use pretrained language models (LMs) as implicit knowledge bases that accept knowledge queries with prompts. Yet, the implicit knowledge lacks many desirable properties of a full-scale symbolic KG, such as easy access, navigation, editing, and quality assurance. In this paper, we propose a new approach of harvesting massive KGs of arbitrary relations from pretrained LMs. With minimal input of a relation definition (a prompt and a few shot of example entity pairs), the approach efficiently searches in the vast entity pair space to extract diverse accurate knowledge of the desired relation. We develop an effective search-and-rescore mechanism for improved efficiency and accuracy. We deploy the approach to harvest KGs of over 400 new relations from different LMs. Extensive human and automatic evaluations show our approach manages to extract diverse accurate knowledge, including tuples of complex relations (e.g., "A is capable of but not good at B"). The resulting KGs as a symbolic interpretation of the source LMs also reveal new insights into the LMs' knowledge capacities.
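To make the abstract's "search-and-rescore" idea concrete, here is a minimal illustrative sketch, not the paper's implementation: given a relation prompt and candidate entity lists, every candidate pair instantiates the prompt, each filled sentence is scored, and only the top-scoring pairs are kept as harvested knowledge tuples. The function and variable names are hypothetical, and the lookup-table scorer merely stands in for an LM's log-likelihood.

```python
from itertools import product

def search_and_rescore(prompt, heads, tails, lm_score, top_k):
    """Enumerate candidate entity pairs, score each filled-in prompt with
    the (stand-in) LM scorer, and keep the top_k highest-scoring pairs."""
    candidates = []
    for a, b in product(heads, tails):
        sentence = prompt.replace("{A}", a).replace("{B}", b)
        candidates.append(((a, b), lm_score(sentence)))
    # Rescore step: rank all candidates and retain only the best tuples.
    candidates.sort(key=lambda x: x[1], reverse=True)
    return [pair for pair, _ in candidates[:top_k]]

# Stand-in scorer: a lookup table playing the role of LM log-likelihood.
scores = {
    "humans are capable of but not good at flying": -1.0,
    "humans are capable of but not good at walking": -9.0,
    "fish are capable of but not good at flying": -8.0,
    "fish are capable of but not good at walking": -7.0,
}
result = search_and_rescore(
    "{A} are capable of but not good at {B}",
    heads=["humans", "fish"],
    tails=["flying", "walking"],
    lm_score=lambda s: scores[s],
    top_k=1,
)
# result == [("humans", "flying")]
```

In the actual approach the scorer is a pretrained LM and the entity pair space is far too large to enumerate exhaustively, which is why an efficient search strategy matters; this sketch only shows the enumerate-score-rank skeleton.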
