Neuroevolution gives rise to more focused information transfer compared to backpropagation in recurrent neural networks.

Arend Hintze, Christoph Adami
Author Information
  1. Arend Hintze: Department for MicroData Analytics, Dalarna University, Falun, Sweden.
  2. Christoph Adami: Department of Microbiology and Molecular Genetics, Michigan State University, East Lansing, USA.

Abstract

Artificial neural networks (ANNs) are one of the most promising tools in the quest to develop general artificial intelligence. Their design was inspired by how neurons connect and process information in natural brains, the only other substrate known to harbor intelligence. Yet whereas biological brains are sparsely connected and form sparsely distributed representations, ANNs typically connect every node of one layer to every node of the next. Moreover, modern ANNs are trained with backpropagation, while their natural counterparts were optimized by natural evolution over eons. We study whether the training method influences how information propagates through the network by measuring the transfer entropy, that is, the information transferred from one group of neurons to another. We find that while the distribution of connection weights in optimized networks is largely unaffected by the training method, neuroevolution produces networks in which information transfer is significantly more focused on small groups of neurons than in networks trained by backpropagation, while also being more robust to perturbations of the weights. We conclude that the specific attributes of a training method (local vs. global optimization) can significantly affect how information is processed and relayed through the network, even when overall performance is similar.
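
For readers who want to compute the quantity at the heart of the study, transfer entropy from a source X to a target Y is T(X->Y) = sum over (y_{t+1}, y_t, x_t) of p(y_{t+1}, y_t, x_t) * log2[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]. The sketch below is a minimal plug-in (histogram) estimator of this quantity for discrete activity traces; it assumes binarized neuron states and a history length of one time step, since the abstract does not specify the paper's exact estimator, binning, or history length.

    # Minimal plug-in estimator of transfer entropy T(X->Y) for discrete
    # (e.g., binarized) neuron activity. Assumes a one-step history; the
    # paper's exact estimator is not given in the abstract.
    import numpy as np
    from collections import Counter

    def transfer_entropy(x, y):
        """Estimate T(X->Y) from two equal-length discrete time series."""
        triples  = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_{t+1}, y_t, x_t)
        pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
        pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
        singles  = Counter(y[:-1])                      # y_t
        n = len(x) - 1
        te = 0.0
        for (y1, y0, x0), c in triples.items():
            p_joint     = c / n                             # p(y', y, x)
            p_cond_full = c / pairs_yx[(y0, x0)]            # p(y' | y, x)
            p_cond_self = pairs_yy[(y1, y0)] / singles[y0]  # p(y' | y)
            te += p_joint * np.log2(p_cond_full / p_cond_self)
        return te

    # Toy check: y copies x with a one-step delay, so T(X->Y) should be
    # close to 1 bit while T(Y->X) stays near zero.
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 10_000)
    y = np.roll(x, 1)
    print(transfer_entropy(x, y), transfer_entropy(y, x))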
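
The abstract contrasts backpropagation's local gradient updates with the global search performed by neuroevolution. As a hedged illustration of the latter, the sketch below runs a (1+1) evolutionary hill-climber that mutates all weights of a small recurrent network at once and keeps a mutant only if its fitness on a toy one-step memory task improves; the network size, mutation scale, and fitness function are illustrative assumptions, not the paper's experimental setup.

    # A (1+1) evolutionary hill-climber: mutate every weight, keep the
    # mutant only if fitness improves. Task, sizes, and scales are toy
    # assumptions for illustration only.
    import numpy as np

    rng = np.random.default_rng(1)

    def fitness(w_in, w_rec, seq):
        """Toy memory task: neuron 0 should output the input delayed
        by one step. Returns negative squared error."""
        h, err, prev = np.zeros(w_rec.shape[0]), 0.0, 0.0
        for x in seq:
            h = np.tanh(w_in * x + w_rec @ h)
            err += (h[0] - prev) ** 2
            prev = x
        return -err

    seq = rng.uniform(-1, 1, 200)
    w_in = rng.normal(0, 0.5, 8)        # input weights, 8 hidden neurons
    w_rec = rng.normal(0, 0.5, (8, 8))  # recurrent weights

    best = fitness(w_in, w_rec, seq)
    for _ in range(2000):
        cand_in  = w_in + rng.normal(0, 0.05, w_in.shape)    # mutate
        cand_rec = w_rec + rng.normal(0, 0.05, w_rec.shape)
        f = fitness(cand_in, cand_rec, seq)
        if f > best:                                         # select
            w_in, w_rec, best = cand_in, cand_rec, f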

Keywords

Computation; Memory; Recurrent network; Transfer entropy