RFNet: Multivariate long sequence time-series forecasting based on recurrent representation and feature enhancement.

Dandan Zhang, Zhiqiang Zhang, Nanguang Chen, Yun Wang
Author Information
  1. Dandan Zhang: School of Computer Science and Engineering, Southeast University, Nanjing, China.
  2. Zhiqiang Zhang: School of Computer Science and Engineering, Southeast University, Nanjing, China.
  3. Nanguang Chen: College of Design and Engineering, National University of Singapore, Singapore.
  4. Yun Wang: School of Computer Science and Engineering, Southeast University, Nanjing, China. Electronic address: ywang_cse@seu.edu.cn.

Abstract

Multivariate time series exhibit complex patterns and structures involving interactions among multiple variables and long-term temporal dependencies, making multivariate long sequence time series forecasting (MLSTF) exceptionally challenging. Despite significant progress by Transformer-based methods in the MLSTF domain, many models still rely on stacked encoder-decoder architectures to capture complex time series patterns. This reliance increases computational complexity and overlooks the spatial pattern information in multivariate time series, thereby limiting model performance. To address these challenges, we propose RFNet, a lightweight model based on recurrent representation and feature enhancement. We partition the time series into fixed-size subsequences to retain local contextual temporal pattern information and cross-variable spatial pattern information. The recurrent representation module employs gate attention mechanisms and memory units to capture the local information of each subsequence, and obtains long-term correlations across the input sequence by integrating information from different memory units. Meanwhile, a shared multi-layer perceptron (MLP) captures global pattern information of the input sequence. The feature enhancement module explicitly extracts complex spatial patterns in the time series by transforming the input sequence. We validate RFNet on ten real-world datasets; the results demonstrate an improvement of approximately 55.3% over state-of-the-art MLSTF models, highlighting its significant advantage on multivariate long sequence time series forecasting problems.
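The abstract describes the pipeline only at a high level. As a purely illustrative toy sketch of the patching-plus-gated-memory idea (fixed-size subsequences, a gate over each local summary, a running memory integrated across patches, and a shared MLP head), the NumPy fragment below uses shapes, names, and update rules that are our own assumptions, not RFNet's actual implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def patch(series, patch_len):
    """Split a (T, C) multivariate series into fixed-size subsequences.

    Returns an array of shape (num_patches, patch_len, C); any tail that
    does not fill a whole patch is dropped for simplicity.
    """
    n = series.shape[0] // patch_len
    return series[: n * patch_len].reshape(n, patch_len, series.shape[1])

def recurrent_representation(patches, W_gate, memory_decay=0.9):
    """Toy gated-memory pass over patches (illustrative, not the paper's).

    Each patch is summarized (mean over time), gated elementwise, and
    blended into a running memory; per-step memories are stacked so a
    later head can integrate information across memory units.
    """
    C = patches.shape[2]
    memory = np.zeros(C)
    states = []
    for p in patches:
        local = p.mean(axis=0)           # local summary of the subsequence
        gate = sigmoid(local @ W_gate)   # gate over the C channels
        memory = memory_decay * memory + (1 - memory_decay) * gate * local
        states.append(memory.copy())
    return np.stack(states)              # (num_patches, C)

rng = np.random.default_rng(0)
series = rng.normal(size=(96, 4))        # T=96 steps, C=4 variables
patches = patch(series, patch_len=16)    # -> (6, 16, 4)
W_gate = rng.normal(size=(4, 4))
states = recurrent_representation(patches, W_gate)

# Shared MLP head (one weight matrix reused for every state),
# mapping the final memory to a hypothetical forecast horizon H=8.
W_mlp = rng.normal(size=(4, 8))
forecast = states[-1] @ W_mlp            # shape (8,)
```

In a trained model the gate and head weights would be learned and the memory update would be richer; the sketch only shows how local patch summaries can feed a recurrent memory whose final state is decoded by a shared projection.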

MeSH Terms

Neural Networks, Computer
Forecasting
Multivariate Analysis
Time Factors
Humans
Algorithms

