Bagging Improves the Performance of Deep Learning-Based Semantic Segmentation with Limited Labeled Images: A Case Study of Crop Segmentation for High-Throughput Plant Phenotyping.

Yinglun Zhan, Yuzhen Zhou, Geng Bai, Yufeng Ge
Author Information
  1. Yinglun Zhan: Department of Statistics, University of Nebraska-Lincoln, Lincoln, NE 68583, USA.
  2. Yuzhen Zhou: Department of Statistics, University of Nebraska-Lincoln, Lincoln, NE 68583, USA.
  3. Geng Bai: Department of Biological Systems Engineering, University of Nebraska-Lincoln, Lincoln, NE 68583, USA.
  4. Yufeng Ge: Department of Biological Systems Engineering, University of Nebraska-Lincoln, Lincoln, NE 68583, USA.

Abstract

Advancements in imaging, computer vision, and automation have revolutionized various fields, including field-based high-throughput plant phenotyping (FHTPP), allowing for the rapid and accurate measurement of plant traits. Deep Convolutional Neural Networks (DCNNs) have emerged as a powerful tool in FHTPP, particularly for crop segmentation (identifying crops from the background), which is crucial for trait analysis. However, the effectiveness of DCNNs often hinges on the availability of large labeled datasets, which poses a challenge due to the high cost of labeling. In this study, a deep learning approach with bagging is introduced to enhance crop segmentation from high-resolution RGB images and is tested on the NU-Spidercam dataset of maize plots. The proposed method outperforms traditional machine learning and deep learning models in prediction accuracy and speed. Remarkably, it achieves up to 40% higher Intersection-over-Union (IoU) than the threshold method and 11% higher than conventional machine learning, with significantly faster prediction times and a manageable training duration. Crucially, it demonstrates that even small labeled datasets can yield high accuracy in semantic segmentation. This approach not only proves effective for FHTPP but also suggests potential for broader application in remote sensing, offering a scalable solution to semantic segmentation challenges. This paper is accompanied by publicly available source code.
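
The abstract reports Intersection-over-Union (IoU) gains from pairing bagging with DCNN-based segmentation. As a minimal illustration only, not the authors' released code, the sketch below assumes binary crop/background masks, fuses the predictions of bagged ensemble members by per-pixel majority vote, and scores the fused mask with IoU; the function names and the simulated ensemble are hypothetical.

    # Illustrative sketch (assumed, not from the paper's code release):
    # fuse binary crop/background masks from a bagged ensemble by per-pixel
    # majority vote, then evaluate with Intersection-over-Union (IoU).
    import numpy as np

    def bagged_mask(member_masks: np.ndarray) -> np.ndarray:
        """Combine binary masks of shape (n_members, H, W) by majority vote."""
        votes = member_masks.sum(axis=0)                 # crop votes per pixel
        return (2 * votes >= member_masks.shape[0]).astype(np.uint8)

    def iou(pred: np.ndarray, truth: np.ndarray) -> float:
        """IoU for the crop class: |pred AND truth| / |pred OR truth|."""
        inter = np.logical_and(pred, truth).sum()
        union = np.logical_or(pred, truth).sum()
        return float(inter) / float(union) if union else 1.0

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        truth = rng.integers(0, 2, size=(64, 64))        # toy ground-truth mask
        # Simulate three ensemble members as noisy copies of the ground truth;
        # in practice each member would be a DCNN trained on a bootstrap
        # resample of the small labeled image set.
        members = np.stack([
            np.where(rng.random((64, 64)) < 0.1, 1 - truth, truth)
            for _ in range(3)
        ])
        fused = bagged_mask(members)
        print("IoU of fused mask:", iou(fused, truth))

Majority voting is one common way to aggregate bagged classifiers; averaging per-pixel class probabilities before thresholding is an equally standard alternative.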

Grants

  1. 2020-68013-32371/United States Department of Agriculture
  2. Accession Number 7000908/United States Department of Agriculture

MeSH Terms

Deep Learning
Crops, Agricultural
Phenotype
Zea mays
Image Processing, Computer-Assisted
Neural Networks, Computer
Semantics
