Improving Classification Performance in Dendritic Neuron Models through Practical Initialization Strategies

Xiaohao Wen, Mengchu Zhou, Aiiad Albeshri, Lukui Huang, Xudong Luo, Dan Ning
Author Information
  1. Xiaohao Wen: Teachers College for Vocational and Technical Education, Guangxi Normal University, Guilin 541001, China. ORCID
  2. Mengchu Zhou: Faculty of Innovation Engineering, Macau University of Science and Technology, Macau 999078, China. ORCID
  3. Aiiad Albeshri: Department of Computer Science, King Abdulaziz University, Jeddah 21481, Saudi Arabia. ORCID
  4. Lukui Huang: School of Accounting and Audit, Guangxi University of Finance and Economics, Nanning 530031, China. ORCID
  5. Xudong Luo: Teachers College for Vocational and Technical Education, Guangxi Normal University, Guilin 541001, China. ORCID
  6. Dan Ning: Teachers College for Vocational and Technical Education, Guangxi Normal University, Guilin 541001, China. ORCID

Abstract

A dendritic neuron model (DNM) is a deep neural network model with a unique dendritic tree structure and activation function. Effective initialization of its parameters is crucial to its learning performance. This work proposes a novel initialization method, notable for its simplicity, speed, and straightforward implementation, that is specifically designed to improve the performance of a DNM in classifying high-dimensional data. Extensive experiments on benchmark datasets show that the proposed method outperforms both traditional and recent initialization methods, particularly on high-dimensional datasets. In addition, this work provides valuable insights into the behavior of a DNM during training and into the impact of initialization on its learning performance. This research contributes to the understanding of the initialization problem in deep learning and informs the development of more effective initialization methods for other types of neural network models. The proposed method can serve as a reference for future research on initialization techniques in deep learning.
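The abstract does not specify the paper's initialization scheme, but the classical DNM architecture it refers to (sigmoidal synaptic layer, multiplicative dendritic layer, summing membrane layer, sigmoidal soma) is well established. The following sketch shows that structure with a simple small-uniform random initialization as a placeholder baseline; the layer parameters `k`, `ks`, and `qs` and the uniform range are assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_dnm(n_inputs, n_dendrites, scale=0.1):
    """Placeholder initialization: small uniform synaptic weights w and
    thresholds q. The paper's actual strategy is not given in the abstract."""
    w = rng.uniform(-scale, scale, size=(n_dendrites, n_inputs))
    q = rng.uniform(-scale, scale, size=(n_dendrites, n_inputs))
    return w, q

def dnm_forward(x, w, q, k=5.0, ks=5.0, qs=0.5):
    """Forward pass of a classical dendritic neuron model.

    Synaptic layer: per-connection sigmoid of k*(w*x - q).
    Dendritic layer: product over inputs on each branch (AND-like gating).
    Membrane layer: sum over dendritic branches.
    Soma: final sigmoid, giving a class score in (0, 1).
    """
    y = 1.0 / (1.0 + np.exp(-k * (w * x - q)))    # synaptic layer
    z = np.prod(y, axis=1)                        # dendritic layer
    v = np.sum(z)                                 # membrane layer
    return 1.0 / (1.0 + np.exp(-ks * (v - qs)))   # soma output

w, q = init_dnm(n_inputs=4, n_dendrites=3)
out = dnm_forward(np.ones(4), w, q)
```

Because the dendritic layer multiplies synaptic outputs, poorly scaled initial weights can drive branch outputs toward 0 and stall learning, which is why initialization matters more here than in purely additive networks.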


MeSH Terms

Neural Networks, Computer
Neurons

