Paper Title

PVGRU: Generating Diverse and Relevant Dialogue Responses via Pseudo-Variational Mechanism

Authors

Yongkang Liu, Shi Feng, Daling Wang, Yifei Zhang, Hinrich Schütze

Abstract

We investigate response generation for multi-turn dialogue in generative-based chatbots. Existing generative models based on RNNs (Recurrent Neural Networks) usually employ the last hidden state to summarize the sequences, which makes models unable to capture the subtle variability observed in different dialogues and cannot distinguish the differences between dialogues that are similar in composition. In this paper, we propose a Pseudo-Variational Gated Recurrent Unit (PVGRU) component without posterior knowledge through introducing a recurrent summarizing variable into the GRU, which can aggregate the accumulated distribution variations of subsequences. PVGRU can perceive the subtle semantic variability through summarizing variables that are optimized by the devised distribution consistency and reconstruction objectives. In addition, we build a Pseudo-Variational Hierarchical Dialogue (PVHD) model based on PVGRU. Experimental results demonstrate that PVGRU can broadly improve the diversity and relevance of responses on two benchmark datasets.
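The abstract describes PVGRU as a GRU augmented with a recurrent summarizing variable that accumulates the distribution variations of a subsequence. The snippet below is a minimal illustrative sketch of that idea in PyTorch; the class name `PseudoVariationalGRUCell`, the layer layout, and the reparameterized update of the summarizing variable are assumptions made for illustration and are not the authors' implementation. The paper's distribution-consistency and reconstruction objectives, as well as the hierarchical PVHD model built on top of PVGRU, are omitted here.

```python
import torch
import torch.nn as nn


class PseudoVariationalGRUCell(nn.Module):
    """Sketch of a GRU cell with a recurrent summarizing variable z_t
    (hypothetical parameterization, for illustration only)."""

    def __init__(self, input_size: int, hidden_size: int, latent_size: int):
        super().__init__()
        # Standard GRU cell over the concatenation of the current input
        # and the summarizing variable from the previous step.
        self.gru = nn.GRUCell(input_size + latent_size, hidden_size)
        # Produces mean and log-variance of the new summarizing variable
        # from the current hidden state and the previous variable.
        self.to_stats = nn.Linear(hidden_size + latent_size, 2 * latent_size)

    def forward(self, x_t, h_prev, z_prev):
        # Hidden-state update conditioned on the previous summarizing variable.
        h_t = self.gru(torch.cat([x_t, z_prev], dim=-1), h_prev)
        # Recurrent update of the summarizing variable via the
        # reparameterization trick, so it can accumulate distribution
        # variations over the subsequence seen so far.
        mu, logvar = self.to_stats(torch.cat([h_t, z_prev], dim=-1)).chunk(2, dim=-1)
        z_t = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return h_t, z_t, (mu, logvar)


if __name__ == "__main__":
    # Toy usage: step the cell over a sequence, carrying (h, z) across steps.
    cell = PseudoVariationalGRUCell(input_size=32, hidden_size=64, latent_size=16)
    x = torch.randn(8, 10, 32)          # (batch, seq_len, input_size)
    h = torch.zeros(8, 64)
    z = torch.zeros(8, 16)
    for t in range(x.size(1)):
        h, z, (mu, logvar) = cell(x[:, t], h, z)
```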
