Paper Title

Predicting Temporal Sets with Deep Neural Networks

Paper Authors

Le Yu, Leilei Sun, Bowen Du, Chuanren Liu, Hui Xiong, Weifeng Lv

Paper Abstract

Given a sequence of sets, where each set contains an arbitrary number of elements, the problem of temporal sets prediction aims to predict the elements in the subsequent set. In practice, temporal sets prediction is much more complex than predictive modelling of temporal events and time series, and is still an open problem. Many existing methods, if adapted for the problem of temporal sets prediction, usually follow a two-step strategy: first projecting temporal sets into latent representations, and then learning a predictive model over those latent representations. This two-step approach often leads to information loss and unsatisfactory prediction performance. In this paper, we propose an integrated solution based on deep neural networks for temporal sets prediction. A unique perspective of our approach is to learn element relationships by constructing set-level co-occurrence graphs and then performing graph convolutions on the dynamic relationship graphs. Moreover, we design an attention-based module to adaptively learn the temporal dependencies of elements and sets. Finally, we provide a gated updating mechanism to find the hidden shared patterns in different sequences and to fuse both static and dynamic information to improve prediction performance. Experiments on real-world data sets demonstrate that our approach can achieve competitive performance even with a portion of the training data and can outperform existing methods by a significant margin.
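
The abstract describes three components: set-level co-occurrence graphs processed with graph convolutions, an attention-based temporal module, and a gated updating mechanism. As a rough illustration of the first component only, below is a minimal sketch (not the authors' released code) of building a co-occurrence graph from one user's sequence of sets and applying a single symmetrically normalized graph-convolution step; the function names, embedding size, and toy data are assumptions made for illustration.

```python
# Hypothetical sketch of set-level co-occurrence graph construction and one
# graph-convolution step; names and details are illustrative, not the paper's code.
import numpy as np

def build_cooccurrence_graph(sets, num_elements):
    """Count how often each pair of elements appears together in the same set."""
    adj = np.zeros((num_elements, num_elements))
    for s in sets:
        items = list(s)
        for i in items:
            for j in items:
                if i != j:
                    adj[i, j] += 1.0
    return adj

def graph_convolution(adj, features, weight):
    """One propagation step: normalized adjacency (with self-loops) x features x weight."""
    adj_hat = adj + np.eye(adj.shape[0])           # add self-loops
    deg = adj_hat.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    norm_adj = d_inv_sqrt @ adj_hat @ d_inv_sqrt   # symmetric normalization
    return np.maximum(norm_adj @ features @ weight, 0.0)  # ReLU activation

# Toy usage: one user's sequence of sets over 5 elements, 8-dimensional embeddings.
sets = [{0, 1}, {1, 2, 3}, {0, 3, 4}]
adj = build_cooccurrence_graph(sets, num_elements=5)
features = np.random.rand(5, 8)   # randomly initialized element embeddings
weight = np.random.rand(8, 8)
out = graph_convolution(adj, features, weight)
print(out.shape)  # (5, 8): updated element representations
```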
