Paper Title
Task-specific Pre-training and Prompt Decomposition for Knowledge Graph Population with Language Models
Paper Authors
Paper Abstract
We present a system for knowledge graph population with Language Models, evaluated on the Knowledge Base Construction from Pre-trained Language Models (LM-KBC) challenge at ISWC 2022. Our system involves task-specific pre-training to improve the LM's representation of masked object tokens, prompt decomposition for progressive generation of candidate objects, and other methods for higher-quality retrieval. Our system, based on the BERT LM, is the winner of Track 1 of the LM-KBC challenge; it achieves a 55.0% F1 score on the challenge's hidden test set.
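For context, the sketch below illustrates the generic masked-LM probing setup that the LM-KBC task builds on: a subject is inserted into a relation-specific prompt containing a mask token, and the LM's top predictions are kept as candidate objects. It is a minimal illustration only; the model checkpoint, prompt template, relation, and probability threshold are assumptions for the example and do not reflect the paper's task-specific pre-training or prompt-decomposition pipeline.

```python
# Minimal sketch of masked-LM probing for knowledge graph population.
# Assumptions: bert-base-cased checkpoint, a hypothetical prompt template,
# and an illustrative probability threshold (not the paper's settings).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-cased")

def retrieve_objects(subject: str, template: str, threshold: float = 0.05):
    """Query the masked LM with a relation-specific prompt and keep
    candidate object tokens whose predicted probability exceeds the threshold."""
    prompt = template.format(subject=subject, mask=fill_mask.tokenizer.mask_token)
    candidates = fill_mask(prompt, top_k=10)
    return [(c["token_str"], c["score"]) for c in candidates if c["score"] > threshold]

# Example with a hypothetical "shares a border with" relation template.
print(retrieve_objects("France", "{subject} shares a border with {mask}."))
```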