High-fidelity wheat plant reconstruction using 3D Gaussian splatting and neural radiance fields.

Lewis A G Stuart, Darren M Wells, Jonathan A Atkinson, Simon Castle-Green, Jack Walker, Michael P Pound
Author Information
  1. Lewis A G Stuart: School of Computer Science, University of Nottingham, Nottingham, NG8 1BB, UK. ORCID
  2. Darren M Wells: School of Biosciences, University of Nottingham, Nottingham, LE12 5RD, UK. ORCID
  3. Jonathan A Atkinson: School of Biosciences, University of Nottingham, Nottingham, LE12 5RD, UK. ORCID
  4. Simon Castle-Green: School of Computer Science, University of Nottingham, Nottingham, NG8 1BB, UK. ORCID
  5. Jack Walker: School of Biosciences, University of Nottingham, Nottingham, LE12 5RD, UK.
  6. Michael P Pound: School of Computer Science, University of Nottingham, Nottingham, NG8 1BB, UK. ORCID

Abstract

BACKGROUND: The reconstruction of 3-dimensional (3D) plant models can offer advantages over traditional 2-dimensional approaches by more accurately capturing the complex structure and characteristics of different crops. Conventional software-based 3D reconstruction techniques often produce sparse or noisy representations of plants, while dedicated capture hardware is expensive. Recently, view synthesis models have been developed that can generate detailed 3D scenes, and even 3D models, from only RGB images and camera poses. These models offer unparalleled accuracy but are currently data hungry, requiring large numbers of views with very accurate camera calibration.
RESULTS: In this study, we present a view synthesis dataset comprising 20 individual wheat plants captured across 6 different time frames over a 15-week growth period. We develop a camera capture system using 2 robotic arms combined with a turntable, controlled by a re-deployable and flexible image capture framework. We trained two recent view synthesis models, 3D Gaussian splatting (3DGS) and neural radiance fields (NeRF), on each plant instance. Our results show that both 3DGS and NeRF produce high-fidelity reconstructed images of a plant subject from views not captured in the initial training sets. We also show that these approaches can be used to generate accurate 3D representations of these plants as point clouds, with 0.74-mm and 1.43-mm average accuracy compared with a handheld scanner for 3DGS and NeRF, respectively.
CONCLUSION: We believe that these new methods will be transformative in the field of 3D plant phenotyping, plant reconstruction, and active vision. To further this cause, we release all robot configuration and control software, alongside our extensive multiview dataset. We also release all scripts necessary to train both 3DGS and NeRF, all trained model data, and the final 3D point cloud representations. Our dataset can be accessed via https://plantimages.nottingham.ac.uk/ or https://doi.org/10.5524/102661. Our software can be accessed via https://github.com/Lewis-Stuart-11/3D-Plant-View-Synthesis.
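The per-point "average accuracy" reported above is commonly computed as the mean nearest-neighbour distance from each point in the reconstructed cloud to a reference scan. The sketch below is an illustration of that metric concept using a k-d tree, not the authors' exact evaluation pipeline; the function name and toy data are assumptions for demonstration.

```python
import numpy as np
from scipy.spatial import cKDTree

def average_accuracy_mm(reconstructed: np.ndarray, reference: np.ndarray) -> float:
    """Mean nearest-neighbour distance (in the clouds' units, e.g. mm) from
    each reconstructed point to the reference scan. This is one common way
    to score a view-synthesis point cloud against a ground-truth scan."""
    tree = cKDTree(reference)                 # index the reference scan
    distances, _ = tree.query(reconstructed)  # closest reference point per point
    return float(distances.mean())

# Toy example: a reconstructed cloud offset 1 mm along x from the reference.
reference = np.array([[0.0, 0.0, 0.0],
                      [10.0, 0.0, 0.0],
                      [0.0, 10.0, 0.0]])
reconstructed = reference + np.array([1.0, 0.0, 0.0])
print(average_accuracy_mm(reconstructed, reference))  # → 1.0
```

Because the metric is asymmetric (reconstruction-to-reference only), evaluation protocols often also report the reverse direction or a symmetric chamfer distance to penalise missing geometry as well as spurious points.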

Keywords

Digital twin; imaging; machine learning; robotics


Grants

  1. University of Nottingham

MeSH Term

Triticum
Imaging, Three-Dimensional
Software
Normal Distribution
Image Processing, Computer-Assisted
Algorithms

