Paper Title

Contextualize Knowledge Bases with Transformer for End-to-end Task-Oriented Dialogue Systems

Paper Authors

Yanjie Gou, Yinjie Lei, Lingqiao Liu, Yong Dai, Chunxu Shen

Paper Abstract

Incorporating knowledge bases (KBs) into end-to-end task-oriented dialogue systems is challenging, since it requires properly representing each KB entity in a way that reflects both its KB context and the dialogue context. Existing works represent an entity while perceiving only part of its KB context, which can lead to less effective representations due to information loss and can adversely affect KB reasoning and response generation. To tackle this issue, we explore fully contextualizing the entity representation by dynamically perceiving all relevant entities and the dialogue history. To achieve this, we propose a COntext-aware Memory Enhanced Transformer framework (COMET), which treats the KB as a sequence and leverages a novel Memory Mask to force each entity to attend only to its relevant entities and the dialogue history, avoiding distraction from irrelevant entities. Through extensive experiments, we show that our COMET framework achieves superior performance over the state of the art.
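
The masking idea the abstract describes can be illustrated with a short sketch. The snippet below is a minimal, hypothetical illustration rather than the authors' released code: it lays out the input as [dialogue tokens | flattened KB entity tokens], builds a boolean mask so that each entity token attends only to the dialogue history and to its "relevant" entities, and applies that mask inside scaled dot-product attention. The helper names (`build_memory_mask`, `masked_attention`), the sequence layout, and the assumption that "relevant entities" means entities from the same KB row are all illustrative choices, not details confirmed by the abstract.

```python
import torch
import torch.nn.functional as F


def build_memory_mask(row_ids: torch.Tensor, n_dialogue: int) -> torch.Tensor:
    """Build a boolean attention mask for a sequence laid out as
    [dialogue tokens | flattened KB entity tokens].

    row_ids:    (n_kb,) KB row index of each entity token.
    n_dialogue: number of dialogue-history tokens at the front.
    Returns:    (seq_len, seq_len) mask where True means attention is allowed.

    Assumption (for illustration only): an entity's "relevant entities"
    are the entities in the same KB row.
    """
    n_kb = row_ids.numel()
    seq_len = n_dialogue + n_kb
    mask = torch.zeros(seq_len, seq_len, dtype=torch.bool)
    # Dialogue tokens may attend to the full sequence.
    mask[:n_dialogue, :] = True
    # Every token may attend to the dialogue history.
    mask[:, :n_dialogue] = True
    # Entity tokens may additionally attend to same-row entities (incl. self),
    # which blocks distraction from unrelated KB entries.
    same_row = row_ids.unsqueeze(0) == row_ids.unsqueeze(1)  # (n_kb, n_kb)
    mask[n_dialogue:, n_dialogue:] = same_row
    return mask


def masked_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor,
                     mask: torch.Tensor) -> torch.Tensor:
    """Scaled dot-product attention; disallowed positions are set to -inf
    before the softmax so they receive zero attention weight."""
    scores = q @ k.transpose(-2, -1) / q.size(-1) ** 0.5
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v


# Toy usage: 4 dialogue tokens, then 6 entity tokens from 2 KB rows.
row_ids = torch.tensor([0, 0, 0, 1, 1, 1])
mask = build_memory_mask(row_ids, n_dialogue=4)
x = torch.randn(10, 32)  # (seq_len, d_model)
out = masked_attention(x, x, x, mask)
print(out.shape)  # torch.Size([10, 32])
```

In this sketch the mask is applied per attention layer, so an entity representation is contextualized by its relevant entities and the dialogue history at every layer while irrelevant entities contribute nothing; how COMET defines relevance and integrates the mask into its Transformer stack is specified in the paper itself.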
