Development of a machine vision-based weight prediction system of butterhead lettuce (Lactuca sativa L.) using deep learning models for industrial plant factory.

Jung-Sun Gloria Kim, Seongje Moon, Junyoung Park, Taehyeong Kim, Soo Chung
Author Information
  1. Jung-Sun Gloria Kim: Department of Biosystems Engineering, Seoul National University, Seoul, Republic of Korea.
  2. Seongje Moon: Department of Biosystems Engineering, Seoul National University, Seoul, Republic of Korea.
  3. Junyoung Park: Department of Biosystems Engineering, Seoul National University, Seoul, Republic of Korea.
  4. Taehyeong Kim: Department of Biosystems Engineering, Seoul National University, Seoul, Republic of Korea.
  5. Soo Chung: Department of Biosystems Engineering, Seoul National University, Seoul, Republic of Korea.

Abstract

Introduction: Indoor agriculture, especially plant factories, has become essential because it allows crops to be cultivated year-round, helping to address global food shortages. Plant factories have been growing in scale as they are commercialized. To maximize yield and profit, an on-site system that non-destructively estimates the fresh weight of crops is needed to support harvest-timing decisions. However, a multi-layer growing environment shared with on-site workers is too confined and crowded for deploying a high-performance system. This research developed a machine vision-based fresh weight estimation system that monitors crops from the transplant stage to harvest with less physical labor in an on-site industrial plant factory.
Methods: A linear motion guide with a camera rail moving along both the x-axis and y-axis was built and mounted on a cultivation rack, within a height of 35 cm, to obtain consistent top-view images of the crops. A Raspberry Pi 4 controlled its operation, capturing images automatically every hour. Fresh weight was measured manually eleven times over four months to serve as the ground truth for the models. The acquired images were preprocessed and used to develop weight prediction models based on both manual and automatic feature extraction.
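The hourly, automated top-view capture described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' code: the `rack_id` naming scheme and the use of the `libcamera-still` CLI on the Raspberry Pi 4 are assumptions, and any available camera interface would serve equally well.

```python
from datetime import datetime, timedelta
import subprocess


def next_capture_delay(now: datetime) -> float:
    """Seconds from `now` until the next top-of-hour capture."""
    next_hour = now.replace(minute=0, second=0, microsecond=0) + timedelta(hours=1)
    return (next_hour - now).total_seconds()


def image_filename(rack_id: str, now: datetime) -> str:
    """Timestamped name so top-view images sort chronologically per rack."""
    return f"{rack_id}_{now.strftime('%Y%m%d_%H%M')}.jpg"


def capture(rack_id: str, now: datetime) -> str:
    """Capture one JPEG from the rail-mounted camera (runs on the Pi itself)."""
    path = image_filename(rack_id, now)
    subprocess.run(["libcamera-still", "-o", path], check=True)
    return path
```

A supervising loop would sleep for `next_capture_delay(datetime.now())` seconds, call `capture`, and repeat; keeping the timestamp in the filename lets the hourly images be matched later against the eleven manual fresh-weight measurements.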
Results and discussion: The models' performance was compared, and the best performer was the automatic feature extraction-based model using a convolutional neural network (CNN; ResNet18). The CNN-based model, which extracts features automatically from the images, performed much better than any of the manual feature extraction-based models, with a coefficient of determination (R²) of 0.95 and a root mean square error (RMSE) of 8.06 g. However, a multilayer perceptron model (MLP_2) was more appropriate for on-site adoption, since its inference was around nine times faster than the CNN's with only a slightly lower R² (0.93). With this system, field workers in a confined indoor farming environment can measure the fresh weight of crops non-destructively and easily, and it can help them decide on the spot when to harvest.
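The two reported evaluation metrics, R² and RMSE, follow their standard definitions; a plain-Python sketch of those definitions (generic, not code from the study) makes the comparison between models concrete:

```python
import math


def rmse(y_true, y_pred):
    """Root mean square error between measured and predicted fresh weights."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)


def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

Under these definitions, the ResNet18 model's R² of 0.95 and RMSE of 8.06 g mean that 95% of the variance in ground-truth fresh weight is explained by its predictions, with a typical prediction error of about 8 g.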

Keywords

butterhead lettuce; plant factory; deep learning; computer vision; controlled-environment agriculture; data acquisition; regression
