Automatic recognition of parasitic products in stool examination using object detection approach.

Kaung Myat Naing, Siridech Boonsang, Santhad Chuwongin, Veerayuth Kittichai, Teerawat Tongloy, Samrerng Prommongkol, Paron Dekumyoy, Dorn Watthanakulpanich
Author Information
  1. Kaung Myat Naing: Center of Industrial Robot and Automation (CiRA), College of Advanced Manufacturing Innovation, King Mongkut's Institute of Technology Ladkrabang, Bangkok, Thailand.
  2. Siridech Boonsang: Department of Electrical Engineering, School of Engineering, King Mongkut's Institute of Technology Ladkrabang, Bangkok, Thailand.
  3. Santhad Chuwongin: Center of Industrial Robot and Automation (CiRA), College of Advanced Manufacturing Innovation, King Mongkut's Institute of Technology Ladkrabang, Bangkok, Thailand.
  4. Veerayuth Kittichai: Faculty of Medicine, King Mongkut's Institute of Technology Ladkrabang, Bangkok, Thailand.
  5. Teerawat Tongloy: Center of Industrial Robot and Automation (CiRA), College of Advanced Manufacturing Innovation, King Mongkut's Institute of Technology Ladkrabang, Bangkok, Thailand.
  6. Samrerng Prommongkol: Mahidol Bangkok School of Tropical Medicine, Faculty of Tropical Medicine, Mahidol University, Bangkok, Thailand.
  7. Paron Dekumyoy: Department of Helminthology, Faculty of Tropical Medicine, Mahidol University, Bangkok, Thailand.
  8. Dorn Watthanakulpanich: Department of Helminthology, Faculty of Tropical Medicine, Mahidol University, Bangkok, Thailand.

Abstract

Background: Object detection is an artificial intelligence approach to the morphological recognition and labeling of parasitic pathogens. Because remote areas of developing countries often lack laboratory equipment and trained personnel, artificial intelligence tools for finding parasitic products in stool examinations could give patients in these areas access to diagnostic services. Object detection has already been shown to be effective at detecting intestinal parasitic objects such as protozoan cysts and helminthic eggs, making it suitable for rural settings where many of the resources that support laboratory testing are still lacking. According to the literature, YOLOv4-Tiny produces results faster and uses less memory than larger models and runs on low-end GPU devices. This study aimed to propose an automated object detection approach, specifically the YOLOv4-Tiny model, for automatic recognition of intestinal parasitic products in stools, and to compare it with the YOLOv3 and YOLOv3-Tiny models.
Methods: To identify protozoan cysts and helminthic eggs in human feces, three YOLO models (YOLOv4-Tiny, YOLOv3, and YOLOv3-Tiny) were trained on an image dataset covering 34 classes of intestinal parasitic objects. Stool samples were processed using a modified direct smear method adapted from the simple direct smear and the modified Kato-Katz methods, and the image dataset was collected from the intestinal parasitic objects found during stool examination.
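As a rough illustration of how a trained YOLOv4-Tiny detector of this kind could be applied to a stool micrograph, the sketch below uses OpenCV's DNN module with Darknet-format weights. The file names (yolov4-tiny-parasite.cfg, yolov4-tiny-parasite.weights, classes.txt) and thresholds are assumptions for illustration, not the authors' exact pipeline.

```python
# Hypothetical inference sketch for a Darknet-format YOLOv4-Tiny detector.
# File names and thresholds are assumptions, not taken from the paper.
import cv2

# Load the trained network (config + weights exported by Darknet).
net = cv2.dnn.readNetFromDarknet("yolov4-tiny-parasite.cfg",
                                 "yolov4-tiny-parasite.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

# One label per line, e.g. "Ascaris lumbricoides egg", "Giardia cyst", ...
with open("classes.txt") as f:
    class_names = [line.strip() for line in f]

image = cv2.imread("stool_smear_field.jpg")

# detect() applies the confidence threshold and non-maximum suppression.
class_ids, scores, boxes = model.detect(image,
                                        confThreshold=0.25,
                                        nmsThreshold=0.45)
for cid, score, (x, y, w, h) in zip(class_ids, scores, boxes):
    print(f"{class_names[int(cid)]}: {float(score):.2f} at ({x}, {y}, {w}, {h})")
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("annotated_field.jpg", image)
```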
Results: On the test dataset, analyzed with non-maximum suppression and a confidence threshold, YOLOv4-Tiny achieved 96.25% precision and 95.08% sensitivity. It also had the best area under the precision-recall curve (AUPRC) of the three YOLO models, with a score of 0.963.
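For readers unfamiliar with these metrics, the following is a minimal sketch of how precision, sensitivity, and AUPRC can be computed once detections have been matched against ground truth (e.g., by an IoU criterion). The scikit-learn usage is standard; the labels, scores, and the 0.5 threshold are illustrative assumptions, not the paper's data.

```python
# Illustrative metric computation, assuming each detection has already been
# matched against ground truth (1 = real parasitic object, 0 = false positive).
from sklearn.metrics import precision_recall_curve, auc

y_true = [1, 1, 0, 1, 1, 0, 1, 1, 1, 1]          # assumed match labels
y_score = [0.98, 0.95, 0.40, 0.91, 0.88,
           0.35, 0.97, 0.93, 0.90, 0.86]          # assumed confidences

# Precision and sensitivity (recall) at a fixed confidence threshold.
threshold = 0.5
tp = sum(1 for t, s in zip(y_true, y_score) if t == 1 and s >= threshold)
fp = sum(1 for t, s in zip(y_true, y_score) if t == 0 and s >= threshold)
fn = sum(1 for t, s in zip(y_true, y_score) if t == 1 and s < threshold)
precision = tp / (tp + fp)
sensitivity = tp / (tp + fn)

# Area under the precision-recall curve (AUPRC) across all thresholds.
prec, rec, _ = precision_recall_curve(y_true, y_score)
auprc = auc(rec, prec)

print(f"precision={precision:.4f} sensitivity={sensitivity:.4f} AUPRC={auprc:.3f}")
```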
Conclusion: To our knowledge, this study is the first to detect protozoan cysts and helminthic eggs across 34 classes of intestinal parasitic objects in human stools.

Associated Data

figshare | 10.6084/m9.figshare.19200404.v3
