LCDFormer: Long-term correlations dual-graph transformer for traffic forecasting
Expert Systems with Applications (IF 8.5), Pub Date: 2024-03-20, DOI: 10.1016/j.eswa.2024.123721
Jiongbiao Cai, Chia-Hung Wang, Kun Hu

Traffic forecasting has always been a critical component of intelligent transportation systems. Because of the complexity of traffic prediction models, most studies consider only short-term historical data in the temporal dimension. However, learning temporal patterns requires long-term historical data. Additionally, many models are limited in capturing spatial features, considering only short-distance spatial information from nodes directly connected to the target node. To address these problems, we propose a dual-graph transformer, the Long-term Correlations Dual-graph transFormer (LCDFormer), designed to capture long-term temporal correlations and long-distance spatial correlations. It is built entirely on attention mechanisms; to the best of our knowledge, little research has adopted this approach, and our work addresses this gap in the literature. In particular, we devise a time aggregation method that consolidates long-term historical time series, capturing long-term temporal correlations while minimizing the influence of redundant data. We then introduce a novel spatio-temporal attention module that compresses spatial information to generate short-term input sequences while modeling dynamic long-range spatial correlations. We conducted extensive experiments with LCDFormer on five real-world traffic datasets. The results indicate that by considering long-term spatio-temporal correlations, LCDFormer learns the spatio-temporal patterns of traffic data more effectively. Compared to the current state-of-the-art baseline, our model achieves improvements of up to 5.02% in mean absolute error, 4.33% in root mean square error, and 7.32% in mean absolute percentage error. The source code is available at: .
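The abstract does not spell out the aggregation rule, but one common way to consolidate long-term history into a short input sequence is to average the same time-of-day slice across recent days, keeping daily periodicity while collapsing redundant samples. The sketch below shows that idea under stated assumptions; the window sizes, tensor layout, and function name are illustrative, not taken from LCDFormer.

```python
# Illustrative consolidation of long-term history: per-time-slot mean over
# the most recent n_days days. This is an assumed aggregation rule, not the
# authors' actual method.
import numpy as np

def aggregate_history(series: np.ndarray, steps_per_day: int, n_days: int) -> np.ndarray:
    """series: (T, nodes) long-term history; returns (steps_per_day, nodes),
    the per-slot average over the last n_days days."""
    recent = series[-n_days * steps_per_day:]
    return recent.reshape(n_days, steps_per_day, -1).mean(axis=0)

# e.g. 5-minute data (288 steps/day) for 207 sensors over 4 weeks:
hist = np.random.rand(28 * 288, 207)
short_input = aggregate_history(hist, steps_per_day=288, n_days=7)  # (288, 207)
```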
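For readers unfamiliar with attention-only designs, the following is a minimal sketch of scaled dot-product attention applied along the spatial (node) axis of a traffic tensor. It illustrates the generic mechanism the abstract refers to; the tensor layout, dimension sizes, and function name are assumptions, not the authors' LCDFormer module.

```python
# Generic spatial attention over a (batch, time, nodes, features) tensor;
# each node attends to every other node, so correlations are not limited
# to short graph distances.
import torch
import torch.nn.functional as F

def spatial_attention(x: torch.Tensor, wq: torch.Tensor,
                      wk: torch.Tensor, wv: torch.Tensor) -> torch.Tensor:
    # x: (batch, time, nodes, d_model); wq/wk/wv: (d_model, d_model)
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.transpose(-2, -1) / (x.shape[-1] ** 0.5)  # node-to-node scores
    return F.softmax(scores, dim=-1) @ v                     # same shape as x

# Usage on toy data: 2 samples, 12 time steps, 207 sensors, 16 features.
d = 16
x = torch.randn(2, 12, 207, d)
wq, wk, wv = (torch.randn(d, d) for _ in range(3))
out = spatial_attention(x, wq, wk, wv)  # (2, 12, 207, 16)
```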
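The three reported metrics have standard definitions; a small NumPy sketch for reference (not taken from the LCDFormer codebase):

```python
# Standard error metrics used in the paper's evaluation: MAE, RMSE, MAPE.
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute error."""
    return float(np.mean(np.abs(y_true - y_pred)))

def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Root mean square error."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mape(y_true: np.ndarray, y_pred: np.ndarray, eps: float = 1e-8) -> float:
    """Mean absolute percentage error; eps guards against division by zero
    on sensors that report zero flow."""
    return float(np.mean(np.abs((y_true - y_pred) / (y_true + eps))) * 100.0)
```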
