TY - CONF
T1 - Time-Transformer
T2 - 2024 SIAM International Conference on Data Mining
AU - Liu, Yuansan
AU - Wijewickrema, Sudanthi
AU - Li, Ang
AU - Bester, Christofer
AU - O'Leary, Stephen
AU - Bailey, James
N1 - Publisher Copyright:
Copyright © 2024 by SIAM.
PY - 2024/4/11
Y1 - 2024/4/11
N2 - Generating time series data is a promising approach to address data deficiency problems. However, it is also challenging due to the complex temporal properties of time series data, including local correlations as well as global dependencies. Most existing generative models have failed to effectively learn both the local and global properties of time series data. To address this open problem, we propose a novel time series generative model named 'Time-Transformer AAE', which consists of an adversarial autoencoder (AAE) and a newly designed architecture named 'Time-Transformer' within the decoder. The Time-Transformer first simultaneously learns local and global features in a layer-wise parallel design, combining the abilities of Temporal Convolutional Networks and Transformers in extracting local features and global dependencies, respectively. Second, a bidirectional cross-attention is proposed to provide complementary guidance across the two branches and achieve proper fusion between local and global features. Experimental results demonstrate that our model can outperform existing state-of-the-art models on 5 out of 6 datasets, specifically on those with data containing both global and local properties. Furthermore, we highlight our model's ability to handle this kind of data via an artificial dataset. Finally, we show how our model performs when applied to a real-world problem: data augmentation to support learning with small datasets and imbalanced datasets.
AB - Generating time series data is a promising approach to address data deficiency problems. However, it is also challenging due to the complex temporal properties of time series data, including local correlations as well as global dependencies. Most existing generative models have failed to effectively learn both the local and global properties of time series data. To address this open problem, we propose a novel time series generative model named 'Time-Transformer AAE', which consists of an adversarial autoencoder (AAE) and a newly designed architecture named 'Time-Transformer' within the decoder. The Time-Transformer first simultaneously learns local and global features in a layer-wise parallel design, combining the abilities of Temporal Convolutional Networks and Transformers in extracting local features and global dependencies, respectively. Second, a bidirectional cross-attention is proposed to provide complementary guidance across the two branches and achieve proper fusion between local and global features. Experimental results demonstrate that our model can outperform existing state-of-the-art models on 5 out of 6 datasets, specifically on those with data containing both global and local properties. Furthermore, we highlight our model's ability to handle this kind of data via an artificial dataset. Finally, we show how our model performs when applied to a real-world problem: data augmentation to support learning with small datasets and imbalanced datasets.
KW - bidirectional cross-attention
KW - temporal convolutional networks
KW - time series generation
KW - transformer
UR - http://www.scopus.com/inward/record.url?scp=85193512641&partnerID=8YFLogxK
U2 - 10.1137/1.9781611978032.37
DO - 10.1137/1.9781611978032.37
M3 - Conference paper
AN - SCOPUS:85193512641
T3 - Proceedings of the 2024 SIAM International Conference on Data Mining, SDM 2024
SP - 325
EP - 333
BT - Proceedings of the 2024 SIAM International Conference on Data Mining, SDM 2024
A2 - Shekhar, Shashi
A2 - Papalexakis, Vagelis
A2 - Gao, Jing
A2 - Jiang, Zhe
A2 - Riondato, Matteo
PB - Society for Industrial and Applied Mathematics Publications
CY - USA
Y2 - 18 April 2024 through 20 April 2024
ER -