Paper Title

A Densely Connected Criss-Cross Attention Network for Document-level Relation Extraction

Authors

Zhang, Liang; Cheng, Yidong

Abstract


Document-level relation extraction (RE) aims to identify relations between two entities in a given document. Compared with its sentence-level counterpart, document-level RE requires complex reasoning. Previous research normally completed reasoning through information propagation on the mention-level or entity-level document graph, but rarely considered reasoning at the entity-pair level. In this paper, we propose a novel model, called Densely Connected Criss-Cross Attention Network (Dense-CCNet), for document-level RE, which can complete logical reasoning at the entity-pair level. Specifically, the Dense-CCNet performs entity-pair-level logical reasoning through Criss-Cross Attention (CCA), which collects contextual information in the horizontal and vertical directions of the entity-pair matrix to enhance the corresponding entity-pair representation. In addition, we densely connect multiple layers of the CCA to simultaneously capture the features of single-hop and multi-hop logical reasoning. We evaluate our Dense-CCNet model on three public document-level RE datasets: DocRED, CDR, and GDA. Experimental results demonstrate that our model achieves state-of-the-art performance on these three datasets.
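To make the CCA idea concrete, below is a minimal NumPy sketch of one criss-cross attention pass over an entity-pair matrix `H`, where cell `(i, j)` holds the representation of the pair `(e_i, e_j)` and attends only to its own row and column. This is a simplified single-head formulation under our own assumptions; the function name, projection matrices `Wq/Wk/Wv`, and shapes are illustrative and not taken from the paper.

```python
import numpy as np

def criss_cross_attention(H, Wq, Wk, Wv):
    """One simplified Criss-Cross Attention (CCA) pass (illustrative sketch).

    H  : (n, n, d) entity-pair matrix; H[i, j] represents the pair (e_i, e_j).
    Wq, Wk, Wv : (d, d) query/key/value projections (hypothetical parameters).

    Each cell attends over the cells in its row and column (the
    "criss-cross" path) and is replaced by the attention-weighted context.
    """
    n, _, d = H.shape
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    out = np.empty_like(H)
    for i in range(n):
        for j in range(n):
            # Context = row i plus column j. Note cell (i, j) appears in
            # both slices; we keep the duplicate for brevity.
            ctx_k = np.concatenate([K[i, :], K[:, j]], axis=0)  # (2n, d)
            ctx_v = np.concatenate([V[i, :], V[:, j]], axis=0)  # (2n, d)
            scores = ctx_k @ Q[i, j] / np.sqrt(d)               # (2n,)
            w = np.exp(scores - scores.max())
            w /= w.sum()                                        # softmax
            out[i, j] = w @ ctx_v                               # (d,)
    return out
```

One pass propagates information along a single row/column hop; stacking (and, as in the abstract, densely connecting) several such layers lets a cell aggregate context from the full matrix, which is what enables multi-hop reasoning over entity pairs.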
