Learning noise-induced transitions by multi-scaling reservoir computing.

Zequn Lin, Zhaofan Lu, Zengru Di, Ying Tang
Author Information
  1. Zequn Lin: Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu, 611731, China.
  2. Zhaofan Lu: Department of Systems Science, Faculty of Arts and Sciences, Beijing Normal University, Zhuhai, 519087, China.
  3. Zengru Di: Department of Systems Science, Faculty of Arts and Sciences, Beijing Normal University, Zhuhai, 519087, China.
  4. Ying Tang: Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu, 611731, China. jamestang23@gmail.com.

Abstract

Noise is usually regarded as adversarial to extracting effective dynamics from time series, so conventional approaches typically aim to learn dynamics while mitigating the effect of noise. However, noise can play a functional role in driving transitions between the stable states underlying many stochastic dynamics. We find that a machine learning model, reservoir computing, can learn noise-induced transitions. We propose a concise training protocol centered on a pivotal hyperparameter that controls the time scale. The approach is widely applicable: for a bistable system it generates accurate statistics of transition times under white noise and the specific transition times under colored noise. In contrast, conventional approaches such as SINDy and recurrent neural networks do not faithfully capture stochastic transitions even in the white-noise case. The present approach also captures the asymmetry of a bistable potential, rotational dynamics caused by broken detailed balance, and transitions in multi-stable systems. For experimental protein-folding data, it learns the statistics of transition times between folded states, enabling us to characterize transition dynamics from a small dataset. These results point toward extending prevailing approaches to learning dynamics from noisy time series.


Grants

  1. 12322501 / National Natural Science Foundation of China
  2. 12105014 / National Natural Science Foundation of China
