Prediction of chaotic time series using recurrent neural networks and reservoir computing techniques: A comparative study.

Shahrokh Shahi, Flavio H Fenton, Elizabeth M Cherry
Author Information
  1. Shahrokh Shahi: School of Computational Science and Engineering, Georgia Institute of Technology, Atlanta, GA 30332, United States of America.
  2. Flavio H Fenton: School of Physics, Georgia Institute of Technology, Atlanta, GA 30332, United States of America.
  3. Elizabeth M Cherry: School of Computational Science and Engineering, Georgia Institute of Technology, Atlanta, GA 30332, United States of America.

Abstract

In recent years, machine-learning techniques, particularly deep learning, have outperformed traditional time-series forecasting approaches in many contexts, including univariate and multivariate predictions. This study aims to investigate the capability of (i) gated recurrent neural networks, including long short-term memory (LSTM) and gated recurrent unit (GRU) networks, (ii) reservoir computing (RC) techniques, such as echo state networks (ESNs) and hybrid physics-informed ESNs, and (iii) the nonlinear vector autoregression (NVAR) approach, recently introduced as next-generation RC, for the prediction of chaotic time series, and to compare their performance in terms of accuracy, efficiency, and robustness. We apply these methods to predict time series obtained from two widely used chaotic benchmarks, the Mackey-Glass and Lorenz-63 models, from two other chaotic datasets representing a bursting neuron and the dynamics of the El Niño Southern Oscillation, and from one experimental dataset representing a time series of cardiac voltage with complex dynamics. We find that although gated RNN techniques have been successful in general time-series forecasting, they can fall short in predicting chaotic time series for the methods, datasets, and ranges of hyperparameter values considered here. In contrast, for the chaotic datasets studied, the reservoir computing and NVAR techniques are more computationally efficient and offer more promise for the long-term prediction of chaotic time series.
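
To illustrate the kind of NVAR (next-generation reservoir computing) forecaster compared in the abstract, the following is a minimal, self-contained sketch applied to the Lorenz-63 benchmark. The delay depth, ridge penalty, integration step, and one-step-ahead training target are illustrative assumptions for this sketch only and do not reflect the hyperparameter values or implementation used in the paper.

# Minimal sketch of an NVAR forecaster for the Lorenz-63 system.
# Hyperparameters (delay taps k, ridge penalty, step size dt) are
# illustrative assumptions, not the values used in the paper.
import numpy as np

def lorenz63(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate Lorenz-63 with a simple RK4 scheme and return the trajectory."""
    def f(v):
        x, y, z = v
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    traj = np.empty((n_steps, 3))
    v = np.array([1.0, 1.0, 1.0])
    for i in range(n_steps):
        k1 = f(v); k2 = f(v + 0.5 * dt * k1)
        k3 = f(v + 0.5 * dt * k2); k4 = f(v + dt * k3)
        v = v + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
        traj[i] = v
    return traj

def nvar_features(data, k):
    """Constant term + k delay taps + unique quadratic monomials of those taps."""
    n, d = data.shape
    lin = np.hstack([data[k - 1 - j : n - j] for j in range(k)])  # newest tap first
    quad = np.column_stack([lin[:, i] * lin[:, j]
                            for i in range(lin.shape[1])
                            for j in range(i, lin.shape[1])])
    ones = np.ones((lin.shape[0], 1))
    return np.hstack([ones, lin, quad])

# Train: ridge-regress the next state onto the NVAR feature vector.
data = lorenz63(6000)
k, ridge = 2, 1e-6
train = data[:5000]
X = nvar_features(train, k)[:-1]          # features at times k-1 .. n-2
Y = train[k:]                             # targets: state one step ahead
W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

# Forecast autonomously: feed each prediction back as the newest delay tap.
history = list(train[-k:])
preds = []
for _ in range(500):
    window = np.array(history[-k:])       # last k states, oldest first
    feat = nvar_features(window, k)       # single feature row
    nxt = (feat @ W).ravel()
    preds.append(nxt)
    history.append(nxt)
preds = np.array(preds)
print("first predicted state:", preds[0])

Because training reduces to a single linear least-squares solve over polynomial features, this sketch also shows why NVAR is far cheaper to train than gradient-based LSTM or GRU networks, which is one axis of the comparison in the study.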

Keywords

Grants

  1. R01 HL143450/NHLBI NIH HHS
