Title

Modeling Document Interactions for Learning to Rank with Regularized Self-Attention

Authors

Shuo Sun, Kevin Duh

Abstract

Learning to rank is an important task that has been successfully deployed in many real-world information retrieval systems. Most existing methods compute relevance judgments of documents independently, without holistically considering the entire set of competing documents. In this paper, we explore modeling document interactions with self-attention based neural networks. Although self-attention networks have achieved state-of-the-art results on many NLP tasks, we find empirically that self-attention provides little benefit over baseline neural learning-to-rank architectures. To improve the learning of self-attention weights, we propose simple yet effective regularization terms designed to model interactions between documents. Evaluations on publicly available Learning to Rank (LETOR) datasets show that training self-attention networks with our proposed regularization terms can significantly outperform existing learning-to-rank methods.
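The abstract does not spell out the proposed regularization terms, but the overall setup can be illustrated with a minimal sketch: a listwise scorer in which the candidate documents for a query attend to one another before being scored, plus a stand-in penalty on the attention weights. The class `SelfAttentionRanker`, the entropy-style penalty `attention_entropy_penalty`, and all dimensions below are illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn as nn

class SelfAttentionRanker(nn.Module):
    """Minimal listwise ranker (illustrative, not the paper's architecture):
    documents in the same candidate list attend to one another, so each
    document's score can depend on the competing documents."""

    def __init__(self, feature_dim: int, hidden_dim: int = 64, num_heads: int = 4):
        super().__init__()
        self.proj = nn.Linear(feature_dim, hidden_dim)
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, docs: torch.Tensor):
        # docs: (batch, list_size, feature_dim) -- one candidate list per query
        h = torch.relu(self.proj(docs))
        # Self-attention over the document list models document interactions.
        ctx, attn_weights = self.attn(h, h, h, need_weights=True)
        scores = self.score(ctx).squeeze(-1)   # (batch, list_size)
        return scores, attn_weights            # weights: (batch, list_size, list_size)

def attention_entropy_penalty(attn_weights: torch.Tensor, eps: float = 1e-9):
    """Hypothetical regularizer (NOT the paper's term): penalize near-uniform
    attention rows so that documents learn sharper interactions."""
    entropy = -(attn_weights * (attn_weights + eps).log()).sum(dim=-1)
    return entropy.mean()

# Usage sketch: combine a standard ranking loss with the regularizer.
model = SelfAttentionRanker(feature_dim=136)        # 136-dim features, as in MSLR-WEB-style data (assumed)
docs = torch.randn(2, 10, 136)                      # 2 queries, 10 candidates each
labels = torch.randint(0, 3, (2, 10)).float()       # graded relevance labels
scores, attn = model(docs)
rank_loss = nn.functional.mse_loss(scores, labels)  # stand-in pointwise loss
loss = rank_loss + 0.1 * attention_entropy_penalty(attn)
loss.backward()
```

In this sketch the regularizer enters the training objective as a weighted additive term; the paper's actual regularization terms and ranking loss may differ.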
