Paper Title

Variational Neural Temporal Point Process

Paper Authors

Deokjun Eom, Sehyun Lee, Jaesik Choi

Paper Abstract

A temporal point process is a stochastic process that predicts which type of event is likely to happen and when it will occur, given the history of a sequence of events. Occurrence dynamics arise in many everyday settings, so it is important to learn these temporal dynamics and solve the two associated prediction problems: time prediction and type prediction. In particular, deep neural network based models have outperformed statistical models such as Hawkes processes and Poisson processes. However, many existing approaches overfit to specific events instead of learning and predicting various event types. As a result, such approaches cannot cope with modified relationships between events and fail to predict the intensity functions of temporal point processes well. In this paper, to address these problems, we propose a variational neural temporal point process (VNTPP). We introduce an inference network and a generative network, and train a distribution over a latent variable to handle the stochastic properties within a deep neural network. The intensity functions are computed using the distribution of the latent variable, so we can predict event types and the arrival times of events more accurately. We empirically demonstrate that our model can generalize the representations of various event types. Moreover, we show quantitatively and qualitatively that our model outperforms other deep neural network based models and statistical processes on synthetic and real-world datasets.
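To make the architecture described in the abstract concrete, here is a minimal sketch, assuming PyTorch, of a variational temporal point process: an RNN encodes the event history, an inference network produces a Gaussian over a latent variable, and a generative network maps the latent sample to an intensity value and event-type logits. All module names, dimensions, and the softplus intensity decoder are illustrative assumptions, not the authors' VNTPP implementation.

```python
import torch
import torch.nn as nn

class VariationalTPPSketch(nn.Module):
    """Minimal variational TPP sketch (illustrative only, not the paper's VNTPP)."""

    def __init__(self, num_types, hidden_dim=32, latent_dim=16):
        super().__init__()
        self.embed = nn.Embedding(num_types, hidden_dim)
        # History encoder over (type embedding, inter-arrival time) pairs.
        self.rnn = nn.GRU(hidden_dim + 1, hidden_dim, batch_first=True)
        # Inference network: history state -> parameters of q(z | history).
        self.enc_mu = nn.Linear(hidden_dim, latent_dim)
        self.enc_logvar = nn.Linear(hidden_dim, latent_dim)
        # Generative network: latent sample + history -> intensity and type logits.
        self.dec_intensity = nn.Linear(latent_dim + hidden_dim, 1)
        self.dec_type = nn.Linear(latent_dim + hidden_dim, num_types)

    def forward(self, types, dts):
        # types: (B, T) long event types, dts: (B, T) float inter-arrival times
        x = torch.cat([self.embed(types), dts.unsqueeze(-1)], dim=-1)
        h, _ = self.rnn(x)                                       # (B, T, H)
        mu, logvar = self.enc_mu(h), self.enc_logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()     # reparameterization trick
        ctx = torch.cat([z, h], dim=-1)
        lam = nn.functional.softplus(self.dec_intensity(ctx)).squeeze(-1)  # positive intensity
        type_logits = self.dec_type(ctx)
        # KL(q(z|history) || N(0, I)) per event, the variational regularizer.
        kl = 0.5 * (mu.pow(2) + logvar.exp() - 1.0 - logvar).sum(-1)
        return lam, type_logits, kl
```

In this sketch, training would combine the standard TPP negative log-likelihood (log-intensity at observed events minus an approximation of the compensator integral), a cross-entropy loss on the type logits, and the KL term, mirroring the ELBO-style objective the abstract alludes to; the exact objective and parameterization in VNTPP may differ.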
