TY - JOUR
T1 - A novel frequency sparse downsampling interaction transformer for wind power forecasting
AU - Wang, Hexian
AU - Guo, Dongjie
AU - Wang, Lingmei
AU - Zhou, Tongming
AU - Jia, Chengzhen
AU - Liu, Yushan
N1 - Publisher Copyright:
© 2025 Elsevier Ltd
PY - 2025/4/21
Y1 - 2025/4/21
AB - Accurate wind power forecasting is essential for the safe operation of power systems and efficient electricity market dispatch. This paper proposes a novel neural network architecture tailored to wind power forecasting that leverages the unique characteristics of wind power data. With the rapid development of neural networks, Transformer architectures based on multi-head attention have shown excellent performance in wind power forecasting. However, the permutation-invariant nature of the attention mechanism makes Transformers insensitive to the temporal order of wind power time series. Furthermore, wind power data exhibit significant volatility and pronounced high-frequency components. To address these challenges, this study first decomposes the wind power data into a periodic component and a trend component. By leveraging the low-rank and sparse properties of the time series in the Fourier domain, we transform the periodic component into the frequency domain and propose a frequency-domain sparse attention mechanism. This mechanism filters out noise frequencies, reduces computational complexity, and addresses the order insensitivity inherent in Transformers. For the trend component, a downsampling interactive learning algorithm is proposed that captures both short-term and long-term features of the component. In addition, to the best of our knowledge, this is the first study to apply cutting-edge time-series forecasting models, including Informer, Autoformer, iTransformer, TimesNet, and TimeMixer, to wind power forecasting. Comparative experiments on three wind farms demonstrate that the proposed Frequency Sparse Downsampling Interaction Transformer consistently outperforms the baseline models, reducing errors by 11.87% in ultra-short-term forecasting and 5.97% in short-term forecasting compared with the second-best model.
KW - Downsampling interactive learning
KW - Frequency sparse attention
KW - Transformers
KW - Wind power forecasting
UR - http://www.scopus.com/inward/record.url?scp=105002894618&partnerID=8YFLogxK
U2 - 10.1016/j.energy.2025.136199
DO - 10.1016/j.energy.2025.136199
M3 - Article
AN - SCOPUS:105002894618
SN - 0360-5442
VL - 326
JO - Energy
JF - Energy
M1 - 136199
ER -