Logarithmic Learning Differential Convolutional Neural Network.

Magombe Yasin, Mehmet Sarıgül, Mutlu Avci
Author Information
  1. Magombe Yasin: Islamic University in Uganda, Kumi Road, P.O. BOX 2555, Mbale, 256, Eastern, Uganda. Electronic address: magombe.yasin@iuiu.ac.ug.
  2. Mehmet Sarıgül: Cukurova University, Computer Engineering Department, Adana, Turkey.
  3. Mutlu Avci: Cukurova University, Biomedical Engineering Department, Adana, Turkey.

Abstract

Convolutional Neural Networks (CNNs) have revolutionized image classification through their innovative design and training methodologies in computer vision. The differential convolutional neural network, with simultaneous multidimensional filter realization, improved CNN performance at the expense of higher computational cost. This paper introduces logarithmic learning into the differential convolutional neural network to overcome this drawback by providing faster error minimization and convergence. This is achieved by incorporating a LogRelu activation function, a logarithmic cost function, and a unique logarithmic learning method. The effectiveness of the proposed approaches is evaluated on various datasets with SGD and Adam optimizers. The first step is the adaptation of the LogRelu activation function to convolutional and differential convolutional neural networks. The experimental results show that LogRelu integration into the convolutional neural network and the differential convolutional neural network yields performance improvements ranging from 1.61% to 5.44%. The same integration into ResNet-18, ResNet-34, and ResNet-50 enhances top-1 accuracy by 3.07% to 9.96%. In addition to the LogRelu activation function, a logarithmic cost function with a logarithmic learning method is proposed and adapted to the differential convolutional neural network. These improvements lead to a new differential convolutional neural network, named the Logarithmic Differential Convolutional Neural Network (LDiffCNN), which consistently outperforms the standard CNN, increasing accuracy by up to 3.02%. Notably, LDiffCNN reduces training iterations by up to 38% with faster convergence. The experimental results demonstrate the efficiency of the proposed approach.
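
The abstract does not give the exact definitions of the LogRelu activation or the logarithmic cost function. A minimal PyTorch sketch, assuming a log1p-compressed ReLU and a log1p-compressed wrapper around a standard cross-entropy loss (the names LogReLU and LogLoss and both formulas are illustrative assumptions, not the paper's published method), could look as follows:

import torch
import torch.nn as nn
import torch.nn.functional as F

class LogReLU(nn.Module):
    # Assumed form: 0 for x <= 0, log(1 + x) for x > 0.
    # The paper's LogRelu may use a different formulation.
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.log1p(F.relu(x))

class LogLoss(nn.Module):
    # Hypothetical logarithmic cost: compresses a base loss as log(1 + loss),
    # which damps large error values; only one plausible reading of the abstract.
    def __init__(self, base_loss: nn.Module):
        super().__init__()
        self.base_loss = base_loss

    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        return torch.log1p(self.base_loss(logits, target))

if __name__ == "__main__":
    act = LogReLU()
    print(act(torch.tensor([-1.0, 0.5, 3.0])))  # tensor([0.0000, 0.4055, 1.3863])

    criterion = LogLoss(nn.CrossEntropyLoss())
    logits = torch.randn(4, 10)
    target = torch.randint(0, 10, (4,))
    print(criterion(logits, target))

Logarithmic compression of activations and loss values keeps gradients from being dominated by large errors, which is consistent with the abstract's claim of faster error minimization; the actual definitions used by the authors may differ.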

MeSH Terms

Neural Networks, Computer
Algorithms

