Broad Echo State Network with Reservoir Pruning for Nonstationary Time Series Prediction

Wenjie Liu, Yuting Bai, Xuebo Jin, Xiaoyi Wang, Tingli Su, Jianlei Kong
Author Information
  All authors: School of Artificial Intelligence, Beijing Technology and Business University, Beijing 100048, China.

Abstract

Nonstationary time series arise in various natural and man-made systems, and their prediction is vital for advanced control and management. Neural networks have been explored for time series prediction, but modeling the data's nonstationary and nonlinear features remains a problem. Drawing on the features of the time series and the properties of the network, a novel network is designed with dynamic optimization of the model structure. First, the echo state network (ESN) is introduced into the broad learning system (BLS). The resulting broad echo state network (BESN) increases training efficiency through an incremental learning algorithm that removes error backpropagation. Second, an optimization algorithm is proposed to reduce redundant information in the training of BESN units: neurons are pruned in fixed-size steps according to their contribution degree. Finally, the improved network is applied to different datasets. Tests on time series from natural and man-made systems show that the proposed network outperforms typical methods, including the ESN, BLS, and recurrent neural networks, on nonstationary time series prediction.
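The pipeline the abstract describes — a fixed random reservoir, a readout trained in closed form without backpropagation, and pruning of low-contribution neurons in fixed steps — can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the toy signal, the sizes, and the "contribution degree" proxy (readout weight magnitude times state standard deviation) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, T = 1, 50, 300   # illustrative sizes

# Fixed random input and reservoir weights; the echo state property is
# approximated by rescaling the spectral radius below 1.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# A toy nonstationary signal (sine plus slow trend), one-step-ahead target.
u = (np.sin(0.1 * np.arange(T)) + 0.005 * np.arange(T)).reshape(-1, 1)
y = np.roll(u, -1, axis=0)

# Collect reservoir states; the reservoir itself is never trained.
X = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)
    X[t] = x

# Readout by ridge regression -- closed form, no error backpropagation.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

# Prune a fixed step of neurons with the smallest assumed contribution
# (|readout weight| * state std), then refit the readout on the survivors.
step = 10
contrib = np.abs(W_out[:, 0]) * X.std(axis=0)
keep = np.argsort(contrib)[step:]
X_p = X[:, keep]
W_out_p = np.linalg.solve(X_p.T @ X_p + ridge * np.eye(len(keep)),
                          X_p.T @ y)

mse = float(np.mean((X_p @ W_out_p - y) ** 2))
print(len(keep), mse)
```

In a BESN-style model, several such reservoir units would be stacked broadly and the pruning repeated per unit; the sketch shows only a single unit to keep the two training-free steps (closed-form readout, contribution-based pruning) visible.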

MeSH Term

Algorithms
Humans
Neural Networks, Computer
Neurons
Time Factors
