An Extension Network of Dendritic Neurons.

Qianyi Peng, Shangce Gao, Yirui Wang, Junyan Yi, Gang Yang, Yuki Todo
Author Information
  1. Qianyi Peng: Faculty of Engineering, University of Toyama, Toyama-shi 930-8555, Japan.
  2. Shangce Gao: Faculty of Engineering, University of Toyama, Toyama-shi 930-8555, Japan.
  3. Yirui Wang: Faculty of Electrical Engineering and Computer Science, Ningbo University, Ningbo, Zhejiang 315211, China.
  4. Junyan Yi: Department of Computer Science & Technology, Beijing University of Civil Engineering and Architecture, Beijing 100044, China.
  5. Gang Yang: School of Information, Renmin University of China, Beijing, China.
  6. Yuki Todo: Faculty of Electrical, Information and Communication Engineering, Kanazawa University, Kanazawa, Ishikawa 920-1192, Japan.

Abstract

Deep learning (DL) has achieved breakthrough successes in various tasks, owing to its layer-by-layer information processing and sufficient model complexity. However, DL suffers from redundant model complexity and low interpretability, largely because of its oversimplified basic McCulloch-Pitts neuron unit. A widely recognized, biologically plausible dendritic neuron model (DNM) has proven effective in alleviating these issues, but it can only solve binary classification tasks, which significantly limits its applicability. In this study, a novel extended network based on the dendritic structure is proposed, enabling it to solve multi-class classification problems. In addition, an efficient error back-propagation learning algorithm is derived for the first time. Extensive experiments demonstrate the effectiveness and superiority of the proposed method over nine state-of-the-art classifiers on ten datasets, including a real-world quality-of-web-service application. The results suggest that the proposed learning algorithm is competent and reliable in terms of classification performance and stability, and that it has a notable advantage on small-scale imbalanced data. Additionally, how scale constrains the network structure is examined.
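For readers unfamiliar with the dendritic neuron model mentioned in the abstract, the following is a minimal sketch of a standard DNM forward pass (synaptic layer → multiplicative dendritic layer → summing membrane layer → sigmoidal soma), extended to multiple classes with one neuron per class. The softmax readout and all parameter names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dnm_forward(x, W, Theta, k=5.0, ks=5.0, theta_s=0.5):
    """Forward pass of one dendritic neuron (standard DNM layout).

    x        : input vector, shape (n_features,)
    W, Theta : synaptic weights and thresholds, shape (n_dendrites, n_features)
    k, ks, theta_s : steepness and soma-threshold constants (assumed values)
    """
    # Synaptic layer: each feature connects to every dendrite via a sigmoid.
    Y = sigmoid(k * (W * x - Theta))        # (n_dendrites, n_features)
    # Dendritic layer: multiplicative interaction within each branch.
    Z = np.prod(Y, axis=1)                  # (n_dendrites,)
    # Membrane layer: sum the branch outputs.
    V = np.sum(Z)
    # Soma layer: final sigmoid with firing threshold theta_s.
    return sigmoid(ks * (V - theta_s))

def multiclass_dnm(x, params):
    """Multi-class extension sketch: one dendritic neuron per class,
    scores combined with a softmax readout (an assumption here)."""
    scores = np.array([dnm_forward(x, W, T) for W, T in params])
    e = np.exp(scores - scores.max())       # numerically stable softmax
    return e / e.sum()
```

Because every layer is differentiable, gradients with respect to `W` and `Theta` follow from the chain rule, which is what makes an error back-propagation learning rule of the kind the paper derives possible.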

MeSH Term

Neurons
Algorithms
Software
