Paired competing neurons improving STDP supervised local learning in spiking neural networks.

Gaspard Goupy, Pierre Tirilly, Ioan Marius Bilasco
Author Information
  1. Gaspard Goupy: Univ. Lille, CNRS, Centrale Lille, UMR 9189 CRIStAL, Lille, France.
  2. Pierre Tirilly: Univ. Lille, CNRS, Centrale Lille, UMR 9189 CRIStAL, Lille, France.
  3. Ioan Marius Bilasco: Univ. Lille, CNRS, Centrale Lille, UMR 9189 CRIStAL, Lille, France.

Abstract

Direct training of Spiking Neural Networks (SNNs) on neuromorphic hardware has the potential to significantly reduce the energy consumption of artificial neural network training. SNNs trained with Spike Timing-Dependent Plasticity (STDP) benefit from gradient-free and unsupervised local learning, which can be easily implemented on ultra-low-power neuromorphic hardware. However, classification tasks cannot be performed solely with unsupervised STDP. In this paper, we propose Stabilized Supervised STDP (S2-STDP), a supervised STDP learning rule to train the classification layer of an SNN equipped with unsupervised STDP for feature extraction. S2-STDP integrates error-modulated weight updates that align neuron spikes with desired timestamps derived from the average firing time within the layer. We then introduce a training architecture called Paired Competing Neurons (PCN) to further enhance the learning capabilities of our classification layer trained with S2-STDP. PCN associates each class with paired neurons and encourages neuron specialization toward target or non-target samples through intra-class competition. We evaluate our methods on image recognition datasets, including MNIST, Fashion-MNIST, and CIFAR-10. Results show that our methods outperform state-of-the-art supervised STDP learning rules for comparable architectures and numbers of neurons. Further analysis demonstrates that PCN enhances the performance of S2-STDP regardless of the hyperparameter set and without introducing any additional hyperparameters.
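
To make the abstract's mechanism concrete, below is a minimal NumPy sketch of an error-modulated temporal weight update in the spirit of S2-STDP. It is an illustration under stated assumptions, not the paper's exact rule: the margin `gap`, the exponential eligibility trace with time constant `tau`, and the sign conventions are hypothetical choices; the abstract only specifies that desired timestamps are derived from the layer's average firing time.

```python
import numpy as np

def s2_stdp_update(weights, pre_times, post_times, target_mask,
                   lr=0.01, gap=1.0, tau=5.0):
    """One error-modulated update of the classification layer (sketch).

    weights     : (n_neurons, n_inputs) afferent weights
    pre_times   : (n_inputs,) spike times from the feature layer
    post_times  : (n_neurons,) first-spike times of the output neurons
    target_mask : (n_neurons,) True for the neuron(s) of the sample's class
    """
    t_avg = post_times.mean()
    # Desired timestamps anchored on the layer's average firing time:
    # the target neuron is pushed to fire earlier than average,
    # non-target neurons later ('gap' is a hypothetical margin).
    t_des = np.where(target_mask, t_avg - gap, t_avg + gap)
    # Temporal error: positive when a neuron fired later than desired,
    # so its afferent weights are potentiated to make it fire earlier.
    err = post_times - t_des
    # Causal STDP-like eligibility: inputs that spiked shortly before
    # the output spike contribute most (assumed exponential trace).
    dt = post_times[:, None] - pre_times[None, :]
    elig = np.where(dt >= 0.0, np.exp(-dt / tau), 0.0)
    weights += lr * err[:, None] * elig  # local, gradient-free update
    return weights
```

PCN can then be layered on top without extra hyperparameters. One plausible reading of the intra-class competition is a first-spike winner-takes-all within each pair, so that only the earliest-firing neuron of a pair is updated; the layout of two consecutive neurons per class is an assumption for illustration.

```python
def pcn_winner_mask(post_times, n_classes):
    # Two neurons per class, laid out consecutively (assumed layout).
    pairs = post_times.reshape(n_classes, 2)
    winners = pairs.argmin(axis=1)            # earlier spike wins the pair
    mask = np.zeros((n_classes, 2), dtype=bool)
    mask[np.arange(n_classes), winners] = True
    return mask.reshape(-1)                   # (2 * n_classes,) update gate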
