The effect of choosing optimizer algorithms to improve computer vision tasks: a comparative study.

Esraa Hassan, Mahmoud Y Shams, Noha A Hikal, Samir Elmougy
Author Information
  1. Esraa Hassan: Faculty of Artificial Intelligence, Kafrelsheikh University, Kafrelsheikh, 33516 Egypt.
  2. Mahmoud Y Shams: Faculty of Artificial Intelligence, Kafrelsheikh University, Kafrelsheikh, 33516 Egypt.
  3. Noha A Hikal: Department of Information Technology, Faculty of Computers and Information, Mansoura University, Mansoura, 35516 Egypt.
  4. Samir Elmougy: Department of Computer Science, Faculty of Computers and Information, Mansoura University, Mansoura, 35516 Egypt.

Abstract

Optimization algorithms are used to improve model accuracy, and the optimization process undergoes multiple cycles until convergence. A variety of optimization strategies have been developed to overcome the obstacles encountered during the learning process, and several of them are considered in this study to better understand their complexities. It is crucial to analyse and summarise optimization techniques methodically from a machine learning standpoint, since this can provide direction for future work in both machine learning and optimization. The approaches under consideration include Stochastic Gradient Descent (SGD), Stochastic Gradient Descent with Momentum, Runge-Kutta, Adaptive Learning Rate, Root Mean Square Propagation (RMSProp), Adaptive Moment Estimation (Adam), Deep Ensembles, Feedback Alignment, Direct Feedback Alignment, Adafactor, AMSGrad, and Gravity, with the aim of assessing the ability of each optimizer when applied to machine learning models. Firstly, tests for skin cancer detection were carried out on the standard ISIC dataset using three common optimizers (Adam, SGD, and RMSProp) to explore the effect of each algorithm on the skin images. The optimal training results from this analysis indicate that performance is enhanced using the Adam optimizer, which achieved 97.30% accuracy. The second dataset is COVIDx CT images, on which 99.07% accuracy was achieved with the Adam optimizer. The results indicate that the utilisation of optimizers such as SGD and Adam improves accuracy in the training, testing, and validation stages.
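
As a minimal sketch of the comparison described in the abstract, the snippet below trains the same small Keras image classifier with SGD, SGD with momentum, RMSprop, and Adam and reports the resulting validation accuracy. The network architecture, hyperparameters, and synthetic placeholder data are illustrative assumptions only; they do not reproduce the authors' actual models or the ISIC and COVIDx CT datasets used in the study.

```python
# Illustrative sketch (not the authors' exact pipeline): comparing how the
# choice of optimizer affects training of a small image classifier.
# Synthetic data stands in for the ISIC / COVIDx CT images used in the paper.
import numpy as np
import tensorflow as tf

def build_model():
    # A small CNN; the paper's actual architectures are not reproduced here.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(64, 64, 3)),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(2, activation="softmax"),
    ])

# Placeholder data: 200 random 64x64 RGB "images" with binary labels.
x = np.random.rand(200, 64, 64, 3).astype("float32")
y = np.random.randint(0, 2, size=(200,))

optimizers = {
    "SGD": tf.keras.optimizers.SGD(learning_rate=0.01),
    "SGD+momentum": tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
    "RMSprop": tf.keras.optimizers.RMSprop(learning_rate=0.001),
    "Adam": tf.keras.optimizers.Adam(learning_rate=0.001),
}

for name, opt in optimizers.items():
    model = build_model()
    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(x, y, epochs=3, batch_size=32,
                        validation_split=0.2, verbose=0)
    print(f"{name}: final val_accuracy = "
          f"{history.history['val_accuracy'][-1]:.3f}")
```

On real image data, the same loop structure allows the training, testing, and validation accuracy of each optimizer to be compared under identical conditions, which is the kind of head-to-head evaluation the study reports for Adam, SGD, and RMSProp.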
