AHaH computing: from metastable switches to attractors to machine learning.

Michael Alexander Nugent, Timothy Wesley Molter
Author Information
  1. Michael Alexander Nugent: M. Alexander Nugent Consulting, Santa Fe, New Mexico, United States of America; KnowmTech LLC, Albuquerque, New Mexico, United States of America; Xeiam LLC, Santa Fe, New Mexico, United States of America.
  2. Timothy Wesley Molter: M. Alexander Nugent Consulting, Santa Fe, New Mexico, United States of America; KnowmTech LLC, Albuquerque, New Mexico, United States of America; Xeiam LLC, Santa Fe, New Mexico, United States of America.

Abstract

Modern computing architecture based on the separation of memory and processing leads to a well-known problem called the von Neumann bottleneck, a restrictive limit on the data bandwidth between CPU and RAM. This paper introduces a new approach to computing we call AHaH computing, in which memory and processing are combined. The idea is based on the attractor dynamics of volatile dissipative electronics inspired by biological systems, presenting an attractive alternative architecture that is able to adapt, self-repair, and learn from interactions with the environment. We envision that both von Neumann and AHaH computing architectures will operate together on the same machine, but that the AHaH computing processor may reduce the power consumption and processing time for certain adaptive learning tasks by orders of magnitude. The paper begins by drawing a connection between the properties of volatility, thermodynamics, and Anti-Hebbian and Hebbian (AHaH) plasticity. We show how AHaH synaptic plasticity leads to attractor states that extract the independent components of applied data streams and how they form a computationally complete set of logic functions. After introducing a general memristive device model based on collections of metastable switches, we show how adaptive synaptic weights can be formed from differential pairs of incremental memristors. We also show how arrays of synaptic weights can be used to build a neural node circuit implementing AHaH plasticity. By configuring the attractor states of the AHaH node in different ways, high-level machine learning functions are demonstrated. These include unsupervised clustering, supervised and unsupervised classification, complex signal prediction, unsupervised robotic actuation, and combinatorial optimization of procedures, all key capabilities of biological nervous systems and modern machine learning algorithms with real-world application.
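The two device-level ideas summarized above (a memristor as a collection of metastable switches, and a signed synaptic weight as a differential pair of such memristors) can be sketched in a few lines of code. This is a minimal illustrative model, not the paper's exact formulation: the class names, the two-state switch count, and the crude voltage-biased transition probabilities are all assumptions made for the sketch.

```python
import random

class MetastableSwitchMemristor:
    """Illustrative memristor: N two-state switches; device conductance is
    set by the fraction of switches currently in the conductive (ON) state."""

    def __init__(self, n_switches=1000, g_on=1e-3, g_off=1e-6):
        self.n = n_switches
        self.g_on, self.g_off = g_on, g_off
        self.n_on = n_switches // 2  # start half ON, half OFF

    def conductance(self):
        frac = self.n_on / self.n
        return frac * self.g_on + (1.0 - frac) * self.g_off

    def apply_voltage(self, v, dt=1.0):
        # Assumed update rule: positive voltage biases OFF->ON transitions,
        # negative voltage biases ON->OFF, each with a simple probability.
        p_on = min(dt * max(v, 0.0), 1.0)
        p_off = min(dt * max(-v, 0.0), 1.0)
        flips_on = sum(random.random() < p_on for _ in range(self.n - self.n_on))
        flips_off = sum(random.random() < p_off for _ in range(self.n_on))
        self.n_on += flips_on - flips_off

class Synapse:
    """Signed weight from a differential pair: w = G_a - G_b, so the weight
    can take either sign even though each conductance is positive."""

    def __init__(self):
        self.a = MetastableSwitchMemristor()
        self.b = MetastableSwitchMemristor()

    def weight(self):
        return self.a.conductance() - self.b.conductance()

random.seed(0)
s = Synapse()
print(s.weight())        # ~0: both devices start in the same state
s.a.apply_voltage(0.1)   # drive G_a up; weight becomes positive
print(s.weight() > 0.0)
```

In the paper's circuits, an AHaH node would read many such differential pairs at once and then apply feedback voltages that realize the Anti-Hebbian and Hebbian updates, driving the weights toward the attractor states described in the abstract; this sketch only shows the device and weight representation those circuits build on.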

MeSH Term

Algorithms
Artificial Intelligence
Cluster Analysis
Computer Simulation
Electric Conductivity
Electronics
Models, Theoretical
Robotics
Signal Processing, Computer-Assisted
