Paper Title


Transition Relation Aware Self-Attention for Session-based Recommendation

Authors

Guanghui Zhu, Haojun Hou, Jingfan Chen, Chunfeng Yuan, Yihua Huang

Abstract


Session-based recommendation is a challenging problem in real-world scenarios, e.g., e-commerce, short-video platforms, and music platforms, which aims to predict the next click action based on an anonymous session. Recently, graph neural networks (GNNs) have emerged as the state-of-the-art methods for session-based recommendation. However, we find that these methods have two limitations. One is that item transition relations are not fully exploited, since the relations are not explicitly modeled. The other is that long-range dependencies between items cannot be captured effectively due to the limitations of GNNs. To solve the above problems, we propose a novel approach for session-based recommendation, called Transition Relation Aware Self-Attention (TRASA). Specifically, TRASA first converts the session into a graph and then encodes the shortest path between items through a gated recurrent unit as their transition relation. Then, to capture long-range dependencies, TRASA utilizes the self-attention mechanism to build a direct connection between any two items without going through intermediate ones. Also, the transition relations are incorporated explicitly when computing the attention scores. Extensive experiments on three real-world datasets demonstrate that TRASA consistently outperforms the existing state-of-the-art methods.
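The abstract describes two mechanisms: encoding the shortest path between two items with a GRU to obtain a transition-relation vector, and adding that vector into the self-attention score so any two items connect directly. The sketch below illustrates this idea in plain numpy. It is not the authors' implementation: the hidden size, the additive form of the relation term `q_i · r_ij`, and the toy "path" (just the two endpoint items) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8        # hidden size (illustrative choice)
n_items = 5  # number of items in the session

def gru_cell(x, h, params):
    """One step of a standard GRU cell (biases omitted for brevity)."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = 1.0 / (1.0 + np.exp(-(x @ Wz + h @ Uz)))   # update gate
    r = 1.0 / (1.0 + np.exp(-(x @ Wr + h @ Ur)))   # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)       # candidate state
    return (1.0 - z) * h + z * h_tilde

gru_params = [rng.standard_normal((d, d)) * 0.1 for _ in range(6)]

def encode_path(path_embs):
    """Run the GRU over the shortest-path item embeddings to get r_ij."""
    h = np.zeros(d)
    for x in path_embs:
        h = gru_cell(x, h, gru_params)
    return h

# Item embeddings; as a toy stand-in, the "shortest path" from i to j
# is just the two endpoints (i, j). In TRASA it would come from the
# session graph.
embs = rng.standard_normal((n_items, d))
rel = [[encode_path(embs[[i, j]]) for j in range(n_items)]
       for i in range(n_items)]

Wq = rng.standard_normal((d, d)) * 0.1
Wk = rng.standard_normal((d, d)) * 0.1
Q, K = embs @ Wq, embs @ Wk

# Attention score with the transition relation added explicitly:
# e_ij = (q_i . k_j + q_i . r_ij) / sqrt(d)   (assumed additive form)
scores = np.array([[(Q[i] @ K[j] + Q[i] @ rel[i][j]) / np.sqrt(d)
                    for j in range(n_items)] for i in range(n_items)])
attn = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
```

Because every pair (i, j) gets a score directly, long-range dependencies do not have to propagate through intermediate items as in message-passing GNNs; the relation term is what keeps the graph's transition structure visible to the attention layer.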
