Obstacle avoidance of physical, stereoscopic, and pictorial objects.

Martin Giesel, Daniela Ruseva, Constanze Hesse
Author Information
  Martin Giesel, Daniela Ruseva, and Constanze Hesse: School of Psychology, University of Aberdeen, William Guild Building, Aberdeen, AB24 3FX, UK.

Abstract

Simulated environments, such as virtual and augmented reality, are becoming increasingly popular for the investigation and training of motor actions. Yet it remains unclear whether results of research and training in these environments transfer to natural environments as expected. Here, we investigated the types of visual cues required to ensure naturalistic hand movements in simulated environments. We compared obstacle avoidance of physical objects with obstacle avoidance of closely matched 2D and 3D images of those objects. Participants were asked to reach towards a target position without colliding with obstacles of varying height that were placed in the movement path. Using a pre-test post-test design, we tested obstacle avoidance for 2D and 3D images of obstacles both before and after exposure to the physical obstacles. Consistent with previous findings, participants initially underestimated the height differences between the obstacles, but after exposure to the physical obstacles, avoidance performance for the 3D images became similar to performance for the physical obstacles. No such change was found for the 2D images. Our findings highlight the importance of disparity cues for naturalistic motor actions in personal space. Furthermore, they suggest that the observed change in obstacle avoidance for 3D images resulted from a calibration of the disparity cues in the 3D images using an accurate estimate of the egocentric distance to the obstacles gained from the interaction with the physical obstacles.

Keywords

Binocular disparities; Distance; Hand; Height; Perception and action; VR
