Short-Term Wind Power Prediction Based on Encoder-Decoder Network and Multi-Point Focused Linear Attention Mechanism.

Jinlong Mei, Chengqun Wang, Shuyun Luo, Weiqiang Xu, Zhijiang Deng
Author Information
  1. Jinlong Mei: School of Computer Science and Technology, Zhejiang Sci-Tech University, Hangzhou 310018, China.
  2. Chengqun Wang: Key Laboratory of Intelligent Textile and Flexible Interconnection of Zhejiang Province, Zhejiang Sci-Tech University, Hangzhou 310018, China.
  3. Shuyun Luo: Key Laboratory of Intelligent Textile and Flexible Interconnection of Zhejiang Province, Zhejiang Sci-Tech University, Hangzhou 310018, China.
  4. Weiqiang Xu: Key Laboratory of Intelligent Textile and Flexible Interconnection of Zhejiang Province, Zhejiang Sci-Tech University, Hangzhou 310018, China.
  5. Zhijiang Deng: Fox-Ess, Co., Ltd., Wenzhou 325024, China.

Abstract

Wind energy is a clean energy source characterised by significant uncertainty, and the electricity it generates is therefore highly unpredictable; when integrated into the grid, this variability can substantially affect grid security. In the context of integrating wind power into the grid, accurate prediction of wind power generation is crucial to minimising damage to the grid system. This paper proposes a novel composite model (MLL-MPFLA) that combines a multilayer perceptron (MLP) with an LSTM-based encoder-decoder network for short-term prediction of wind power generation. In this model, the MLP first extracts multidimensional features from the wind power data. An LSTM-based encoder-decoder network then explores the temporal characteristics of the data in depth, combining the multidimensional and temporal features for effective prediction. During decoding, an improved focused linear attention mechanism, called multi-point focused linear attention, is employed; it enhances prediction accuracy by weighting predictions from different subspaces. A comparative analysis against the MLP, LSTM, LSTM-Attention-LSTM, LSTM-Self_Attention-LSTM, and CNN-LSTM-Attention models demonstrates that the proposed MLL-MPFLA model outperforms the others in terms of MAE, RMSE, MAPE, and R², validating its predictive performance.
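The abstract does not give the exact formulation of multi-point focused linear attention, but its two ingredients are known in outline: softmax-free linear attention with a "focused" feature map (which sharpens the query/key activations while preserving their norms), and a weighted combination of attention outputs computed in several subspaces. The sketch below, in NumPy, is an illustrative assumption of how such a mechanism could be composed; `focused_map`, the power `p`, and the random projection/weighting scheme are hypothetical stand-ins, not the authors' implementation.

```python
import numpy as np

def focused_map(x, p=3.0, eps=1e-6):
    # Focused feature map (sketch): raise the non-negative activations to a
    # power to sharpen them, then rescale so each row keeps its original norm.
    x = np.maximum(x, 0.0) + eps
    xp = x ** p
    return xp * (np.linalg.norm(x, axis=-1, keepdims=True)
                 / np.linalg.norm(xp, axis=-1, keepdims=True))

def linear_attention(q, k, v):
    # O(N) linear attention: phi(Q) (phi(K)^T V), normalised per query,
    # instead of the O(N^2) softmax(QK^T)V of standard attention.
    q, k = focused_map(q), focused_map(k)
    kv = k.T @ v                                    # (d, d_v)
    z = q @ k.sum(axis=0, keepdims=True).T + 1e-6   # (n, 1) normaliser
    return (q @ kv) / z

def multi_point_attention(q, k, v, n_heads=4, rng=None):
    # "Multi-point" combination (illustrative assumption): run linear
    # attention in several projected subspaces and blend the outputs with
    # normalised weights, mimicking a weighted multi-head fusion. In a
    # trained model the projections and weights would be learned parameters.
    rng = rng if rng is not None else np.random.default_rng(0)
    d = q.shape[-1]
    outs = []
    for _ in range(n_heads):
        P = rng.standard_normal((d, d)) / np.sqrt(d)  # subspace projection
        outs.append(linear_attention(q @ P, k @ P, v))
    w = np.exp(rng.standard_normal(n_heads))
    w /= w.sum()                                      # softmax-style weights
    return sum(wi * o for wi, o in zip(w, outs))

rng = np.random.default_rng(1)
q = rng.standard_normal((8, 16))   # 8 decoder steps, feature dim 16
k = rng.standard_normal((8, 16))
v = rng.standard_normal((8, 16))
out = multi_point_attention(q, k, v, rng=rng)
print(out.shape)  # (8, 16)
```

Because `phi(K)^T V` is computed once and reused for every query, the cost grows linearly in sequence length, which is the practical appeal of linear attention inside an encoder-decoder forecaster.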


Grants

  1. National Natural Science Foundation of China (Nos. U1709219 and 61601410)
  2. Key Research and Development Program Foundation of Zhejiang (Nos. 2022C01079 and 2024C01060)

