Bidirectional Spatial-Temporal Adaptive Transformer for Urban Traffic Flow Forecasting

Changlu Chen, Yanbin Liu, Ling Chen, Chengqi Zhang

Research output: Contribution to journal › Article › peer-review

90 Citations (Scopus)

Abstract

Urban traffic forecasting is the cornerstone of the intelligent transportation system (ITS). Existing methods focus on spatial-temporal dependency modeling, while two intrinsic properties of the traffic forecasting problem are overlooked. First, the complexity of diverse forecasting tasks is nonuniformly distributed across various spaces (e.g., suburb versus downtown) and times (e.g., rush hour versus off-peak). Second, the recollection of past traffic conditions is beneficial to the prediction of future traffic conditions. Based on these properties, we propose a bidirectional spatial-temporal adaptive transformer (Bi-STAT) for accurate traffic forecasting. Bi-STAT adopts an encoder-decoder architecture, where both the encoder and the decoder maintain a spatial-adaptive transformer and a temporal-adaptive transformer structure. Inspired by the first property, each transformer is designed to dynamically process the traffic streams according to their task complexities. Specifically, we realize this through a recurrent mechanism with a novel dynamic halting module (DHM). Each transformer performs iterative computation with shared parameters until the DHM emits a stopping signal. Motivated by the second property, Bi-STAT utilizes one decoder to perform the present → past recollection task and the other decoder to perform the present → future prediction task. The recollection task supplies complementary information to assist and regularize the prediction task for better generalization. Through extensive experiments, we show the effectiveness of each module in Bi-STAT and demonstrate the superiority of Bi-STAT over the state-of-the-art baselines on four benchmark datasets. The code is available at https://github.com/chenchl19941118/Bi-STAT.git.
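
The dynamic halting mechanism described in the abstract is closely related to adaptive computation time: a single transformer layer with shared parameters is applied recurrently, and a learned halting signal decides when to stop iterating. Below is a minimal, illustrative PyTorch sketch of such a mechanism; the class and parameter names (DynamicHaltingLayer, halt_threshold, max_steps) are assumptions made for illustration and are not taken from the paper or its released code.

```python
# Minimal sketch (not the authors' implementation): an ACT-style halting loop
# over a shared transformer layer. Names and hyperparameters are illustrative.
import torch
import torch.nn as nn


class DynamicHaltingLayer(nn.Module):
    def __init__(self, d_model=64, nhead=4, max_steps=4, halt_threshold=0.99):
        super().__init__()
        # One layer reused at every recurrence step (shared parameters).
        self.shared_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.halt_head = nn.Linear(d_model, 1)  # per-token halting probability
        self.max_steps = max_steps
        self.halt_threshold = halt_threshold

    def forward(self, x):  # x: (batch, seq_len, d_model)
        batch, seq_len, _ = x.shape
        halt_prob = x.new_zeros(batch, seq_len, 1)   # accumulated halting prob.
        remainder = x.new_ones(batch, seq_len, 1)    # probability mass left
        output = torch.zeros_like(x)
        for step in range(self.max_steps):
            still_running = (halt_prob < self.halt_threshold).float()
            if still_running.sum() == 0:             # every token has halted
                break
            x = self.shared_layer(x)                 # iterative shared computation
            p = torch.sigmoid(self.halt_head(x)) * still_running
            # Tokens whose accumulated probability crosses the threshold (or any
            # still-running token on the final step) spend their remaining mass.
            newly_halted = (
                still_running if step == self.max_steps - 1
                else (halt_prob + p >= self.halt_threshold).float() * still_running
            )
            p = torch.where(newly_halted.bool(), remainder, p)
            output = output + p * x                  # step-weighted state average
            halt_prob = halt_prob + p
            remainder = remainder - p
        return output


if __name__ == "__main__":
    layer = DynamicHaltingLayer()
    traffic = torch.randn(8, 12, 64)   # e.g., 8 road segments, 12 time steps
    print(layer(traffic).shape)        # torch.Size([8, 12, 64])
```

In adaptive-computation-time schemes of this kind, a small ponder-cost penalty on the accumulated halting probabilities is typically added to the training loss so that easy inputs (e.g., off-peak suburban traffic) halt after few iterations while harder ones receive more computation.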

Original language: English
Pages (from-to): 6913-6925
Number of pages: 14
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 34
Issue number: 10
Early online date: 30 Jun 2022
Publication status: Published - 1 Oct 2023
