Paper Title

Learning Point Processes using Recurrent Graph Network

Paper Authors

Dash, Saurabh, She, Xueyuan, Mukhopadhyay, Saibal

Abstract


We present a novel Recurrent Graph Network (RGN) approach for predicting discrete marked event sequences by learning the underlying complex stochastic process. Using the framework of Point Processes, we interpret a marked discrete event sequence as the superposition of different sequences, each of a unique type. The nodes of the Graph Network use LSTMs to incorporate past information, whereas a Graph Attention Network (GAT) introduces strong inductive biases to capture the interactions between these different types of events. By changing the self-attention mechanism from attending over past events to attending over event types, we obtain a reduction in time and space complexity from $\mathcal{O}(N^2)$ (total number of events) to $\mathcal{O}(|\mathcal{Y}|^2)$ (number of event types). Experiments show that the proposed approach improves performance on log-likelihood, prediction, and goodness-of-fit tasks with lower time and space complexity compared to state-of-the-art Transformer-based architectures.
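The complexity reduction described above can be illustrated with a minimal sketch: if each event type maintains its own node state (updated by an LSTM over that type's events), then GAT-style attention is computed among the $|\mathcal{Y}|$ type nodes, so the attention matrix is $|\mathcal{Y}| \times |\mathcal{Y}|$ regardless of the number of events $N$. All names, shapes, and the single-head attention form below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
num_types, d = 5, 8                      # |Y| event types, hidden size d (assumed)
h = rng.normal(size=(num_types, d))      # per-type node states from the LSTMs

# GAT-style attention between type nodes (hypothetical single head):
# e_ij = LeakyReLU(a_src . W h_i + a_dst . W h_j), softmax over j.
W = rng.normal(size=(d, d))
a_src = rng.normal(size=d)
a_dst = rng.normal(size=d)

z = h @ W                                              # projected states, (|Y|, d)
logits = (z @ a_src)[:, None] + (z @ a_dst)[None, :]   # pairwise scores, (|Y|, |Y|)
logits = np.where(logits > 0, logits, 0.2 * logits)    # LeakyReLU
alpha = np.exp(logits - logits.max(axis=1, keepdims=True))
alpha /= alpha.sum(axis=1, keepdims=True)              # row-wise softmax

h_new = alpha @ z                                      # updated type states, (|Y|, d)
```

Note that `alpha` has shape `(num_types, num_types)`: the attention cost depends only on the number of event types, not on how many events have been observed, which is the source of the $\mathcal{O}(N^2) \to \mathcal{O}(|\mathcal{Y}|^2)$ reduction.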
