Using hybrid pre-trained models for breast cancer detection.

Sameh Zarif, Hatem Abdulkader, Ibrahim Elaraby, Abdullah Alharbi, Wail S Elkilani, Paweł Pławiak
Author Information
  1. Sameh Zarif: Department of Information Technology, Faculty of Computers and Information, Menoufia University, Shebin El-kom, Menoufia, Egypt.
  2. Hatem Abdulkader: Department of Information Systems, Faculty of Computers and Information, Menoufia University, Shebin El-kom, Menoufia, Egypt.
  3. Ibrahim Elaraby: Department of Information Systems Management, Higher Institute of Qualitative Studies, Cairo, Egypt.
  4. Abdullah Alharbi: Department of Computer Science, Community College, King Saud University, Riyadh, Saudi Arabia.
  5. Wail S Elkilani: College of Applied Computer Science, King Saud University, Riyadh, Saudi Arabia.
  6. Paweł Pławiak: Department of Computer Science, Faculty of Computer Science and Telecommunications, Cracow University of Technology, Krakow, Poland.

Abstract

Breast cancer is a prevalent and life-threatening disease that affects women globally. Early detection and access to high-quality treatment are crucial in preventing deaths from this condition. However, manual analysis of breast histopathology images is time-consuming and prone to errors. This study proposes a hybrid deep learning model (CNN+EfficientNetV2B3) that combines a convolutional neural network (CNN) with a pre-trained EfficientNetV2B3 backbone to classify invasive ductal carcinoma (IDC-positive) versus non-IDC tissue in whole slide images (WSIs), supporting pathologists in making more accurate diagnoses. The proposed model demonstrates outstanding performance, with an accuracy of 96.3%, a precision of 93.4%, a recall of 86.4%, an F1-score of 89.7%, a Matthews correlation coefficient (MCC) of 87.6%, an area under the receiver operating characteristic curve (ROC AUC) of 97.5%, and an area under the precision-recall curve (AUPRC) of 96.8%, exceeding the accuracy achieved by other models. The proposed model was also compared with MobileNet+DenseNet121, MobileNetV2+EfficientNetV2B0, and other deep learning models, and it outperformed contemporary machine learning and deep learning approaches.
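
The following is a minimal sketch, not the authors' released implementation, of how such a hybrid classifier could be assembled in TensorFlow/Keras: a small custom CNN branch and a frozen, ImageNet-pre-trained EfficientNetV2B3 backbone whose pooled features are concatenated ahead of a binary IDC/non-IDC head. The patch size (96x96), layer widths, and fusion-by-concatenation design are illustrative assumptions; only the CNN+EfficientNetV2B3 pairing comes from the abstract.

# Hybrid CNN + EfficientNetV2B3 sketch (architecture details are assumptions).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_hybrid_model(input_shape=(96, 96, 3)):
    inputs = layers.Input(shape=input_shape)

    # Branch 1: small custom CNN trained from scratch (layer widths assumed).
    x = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(64, 3, activation="relu", padding="same")(x)
    x = layers.GlobalAveragePooling2D()(x)

    # Branch 2: pre-trained EfficientNetV2B3 used as a frozen feature extractor.
    backbone = tf.keras.applications.EfficientNetV2B3(
        include_top=False, weights="imagenet",
        input_shape=input_shape, pooling="avg")
    backbone.trainable = False
    y = backbone(inputs)

    # Fuse both feature vectors and classify IDC-positive vs. non-IDC tissue.
    z = layers.Concatenate()([x, y])
    z = layers.Dense(128, activation="relu")(z)
    z = layers.Dropout(0.3)(z)
    outputs = layers.Dense(1, activation="sigmoid")(z)

    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy",
                           tf.keras.metrics.AUC(name="roc_auc"),
                           tf.keras.metrics.AUC(curve="PR", name="auprc"),
                           tf.keras.metrics.Precision(name="precision"),
                           tf.keras.metrics.Recall(name="recall")])
    return model

Threshold-based scores not tracked by these Keras metrics, such as the F1-score and the Matthews correlation coefficient reported above, can be computed after prediction with scikit-learn's f1_score and matthews_corrcoef functions.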

MeSH Terms

Humans
Female
Breast Neoplasms
Area Under Curve
Breast
Carcinoma in Situ
Image Processing, Computer-Assisted
