Optimizing quantum noise-induced reservoir computing for nonlinear and chaotic time series prediction.

Daniel Fry, Amol Deshmukh, Samuel Yen-Chi Chen, Vladimir Rastunkov, Vanio Markov
Author Information
  1. Daniel Fry: IBM Quantum, Thomas J. Watson Research Center, Yorktown Heights, NY, USA. daniel.fry@ibm.com.
  2. Amol Deshmukh: IBM Quantum, Thomas J. Watson Research Center, Yorktown Heights, NY, USA.
  3. Samuel Yen-Chi Chen: Wells Fargo, 150 East 42 Street, New York, NY, 10017, USA.
  4. Vladimir Rastunkov: IBM Quantum, Thomas J. Watson Research Center, Yorktown Heights, NY, USA.
  5. Vanio Markov: Wells Fargo, 150 East 42 Street, New York, NY, 10017, USA.

Abstract

Quantum reservoir computing is emerging as a strong approach to sequential and time series data prediction in quantum machine learning. We advance the quantum noise-induced reservoir, in which reservoir noise is used as a resource to generate expressive, nonlinear signals that are efficiently learned with a single linear output layer. We address the need for quantum reservoir tuning with a novel and generally applicable approach to quantum circuit parameterization, in which tunable noise models are programmed into the quantum reservoir circuit so that the noise can be fully controlled for effective optimization. Our systematic approach also reduces the number of qubits and the complexity of the entanglement scheme in quantum reservoir circuits. We show that with only a single noise model and small memory capacities, excellent simulation results are obtained on nonlinear benchmarks, including the Mackey-Glass system predicted 100 steps ahead in the challenging chaotic regime.
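The noise-induced reservoir scheme described above can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation: the two-qubit system, the depolarizing channel standing in for a tunable noise model, the RX angle encoding of inputs, the fixed random entangling unitary, and the ridge-regularized readout are all illustrative assumptions. It shows only the overall pattern: drive a noisy quantum map with the input series, read out Pauli-Z expectation values, and fit a single linear output layer.

```python
import numpy as np

# Toy 2-qubit noise-induced reservoir (illustrative sketch; all specific
# choices here are assumptions, not the paper's construction).
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)

def kron(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def rx(theta):
    # single-qubit RX rotation used to angle-encode the input
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * X

rng = np.random.default_rng(0)
# fixed random entangling unitary (QR of a complex Gaussian matrix)
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))

def step(rho, u, p=0.1):
    # encode input on qubit 0, entangle, then apply a depolarizing
    # channel of strength p (the "tunable noise" knob in this sketch)
    U = Q @ kron(rx(np.pi * u), I2)
    rho = U @ rho @ U.conj().T
    return (1 - p) * rho + p * np.eye(4) / 4

def features(rho):
    # reservoir signals: Pauli-Z expectations on each qubit, plus a bias
    z0 = np.real(np.trace(kron(Z, I2) @ rho))
    z1 = np.real(np.trace(kron(I2, Z) @ rho))
    return np.array([z0, z1, 1.0])

# drive the reservoir with a toy series and fit a one-step-ahead readout
series = 0.5 * (1 + np.sin(0.3 * np.arange(300)))
rho = np.eye(4) / 4
feats = []
for u in series:
    rho = step(rho, u)
    feats.append(features(rho))
F = np.array(feats[:-1])
y = series[1:]
# single linear output layer, trained by (lightly regularized) ridge regression
W = np.linalg.solve(F.T @ F + 1e-6 * np.eye(3), F.T @ y)
pred = F @ W
print("train MSE:", float(np.mean((pred - y) ** 2)))
```

Only the readout weights `W` are trained; the noisy reservoir dynamics are fixed, which is what keeps the learning step a single linear fit.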

