GTHP: a novel graph transformer Hawkes process for spatiotemporal event prediction
Knowledge and Information Systems (IF 2.7) | Pub Date: 2024-03-19 | DOI: 10.1007/s10115-024-02080-z
Yiman Xie, Jianbin Wu, Yan Zhou

Abstract

Event sequences with spatiotemporal characteristics are being produced rapidly in various domains, such as earthquakes in seismology, electronic medical records in healthcare, and transactions in financial markets. These data often span weeks, months, or years, and past events may trigger subsequent events. In this context, modeling spatiotemporal event sequences and forecasting the next event has become a hot topic. However, existing models either fail to capture long-term temporal dependencies or ignore the essential spatial information between sequences. In this paper, we propose a novel graph transformer Hawkes process (GTHP) model to capture long-term temporal dependencies and spatial information from historical events. The core idea of GTHP is to learn spatial information with a graph convolutional neural network and to capture long-term temporal dependencies from event embeddings with a self-attention mechanism. Moreover, the learned spatial information is integrated into the event embeddings as auxiliary information. Extensive experiments on synthetic and real-world datasets demonstrate the effectiveness of the proposed model.
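
The abstract only describes the architecture at a high level. As a rough illustration, the following is a minimal, hypothetical PyTorch sketch of a GTHP-style encoder; it is not the authors' implementation. The class names, dimensions, the use of an event-type adjacency matrix as the spatial graph, the additive injection of spatial embeddings, and the softplus intensity head are all assumptions made for this example.

```python
# Hypothetical sketch of a GTHP-style encoder (not the authors' code).
# Assumptions: events are (type, timestamp) pairs, spatial structure is given as an
# adjacency matrix over event types/locations, and a Transformer encoder models
# long-term temporal dependencies over spatially enriched event embeddings.
import torch
import torch.nn as nn


class SimpleGCNLayer(nn.Module):
    """One graph-convolution step: H' = ReLU(A_norm @ H @ W)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Symmetrically normalize the adjacency (with self-loops) before mixing node features.
        a = adj + torch.eye(adj.size(0), device=adj.device)
        deg_inv_sqrt = a.sum(-1).clamp(min=1e-6).pow(-0.5)
        a_norm = deg_inv_sqrt.unsqueeze(1) * a * deg_inv_sqrt.unsqueeze(0)
        return torch.relu(self.linear(a_norm @ h))


class GTHPSketch(nn.Module):
    """Graph-enriched Transformer encoder over an event sequence (illustrative only)."""

    def __init__(self, num_types: int, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.type_emb = nn.Embedding(num_types, d_model)      # event-type embedding
        self.node_feat = nn.Parameter(torch.randn(num_types, d_model))
        self.gcn = SimpleGCNLayer(d_model, d_model)            # learns spatial information
        self.time_proj = nn.Linear(1, d_model)                 # crude temporal encoding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.intensity_head = nn.Linear(d_model, num_types)    # per-type intensity logits

    def forward(self, types: torch.Tensor, times: torch.Tensor, adj: torch.Tensor):
        # types: (B, L) event-type ids; times: (B, L) timestamps; adj: (num_types, num_types)
        spatial = self.gcn(self.node_feat, adj)                # (num_types, d_model)
        x = self.type_emb(types) + spatial[types]              # spatial info as auxiliary input
        x = x + self.time_proj(times.unsqueeze(-1))
        # Causal mask so each position only attends to past events.
        mask = nn.Transformer.generate_square_subsequent_mask(types.size(1)).to(types.device)
        h = self.encoder(x, mask=mask)
        # Softplus keeps the conditional intensity non-negative, as in Hawkes-type models.
        return torch.nn.functional.softplus(self.intensity_head(h))


# Toy usage with placeholder data and shapes.
model = GTHPSketch(num_types=5)
types = torch.randint(0, 5, (2, 10))
times = torch.cumsum(torch.rand(2, 10), dim=-1)
adj = (torch.rand(5, 5) > 0.5).float()
intensities = model(types, times, adj)   # (2, 10, 5)
```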

Updated: 2024-03-20