Bio-Inspired Vision and Gesture-Based Robot-Robot Interaction for Human-Cooperative Package Delivery.

Kaustubh Joshi, Abhra Roy Chowdhury
Author Information
  1. Kaustubh Joshi: Department of Mechanical Engineering, University of Maryland, College Park, MD, United States.
  2. Abhra Roy Chowdhury: Centre for Product Design and Manufacturing, Division of Mechanical Engineering, Indian Institute of Science (IISc), Bangalore, India.

Abstract

This research presents a novel bio-inspired framework in which two robots interact for a cooperative package delivery task with a human-in-the-loop. It contributes to eliminating the need for network-based robot-robot interaction in constrained environments. One robot is instructed to move in specific shapes, with a particular orientation and at a certain speed, which the other robot infers using object detection (a custom YOLOv4 model) and depth perception. The shape is identified by calculating the area occupied by the detected polygonal route. A metric for the area's extent is computed and empirically used to assign regions to specific shapes, giving an overall accuracy of 93.3% in simulations and 90% in a physical setup. Additionally, gestures are analyzed for the accuracy of the intended direction, distance, and target coordinates in the map. The system gives an average positional RMSE of 0.349 in simulation and 0.461 in a physical experiment. A video demonstration of the problem statement, along with the simulations and experiments for real-world applications, is given here and in the Supplementary Material.
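
The shape-identification and evaluation steps described above can be illustrated with a minimal sketch. It assumes the area "extent" metric is the traced polygon's area (shoelace formula) normalized by its bounding-box area, with shapes assigned by empirically chosen thresholds on that value; the thresholds, function names, and the positional RMSE formulation below are illustrative assumptions, not details taken from the paper.

    # Hypothetical sketch: classify a traced polygonal route from its area
    # "extent" (shoelace area / bounding-box area) and score target estimates
    # with a positional RMSE. All thresholds and names are illustrative.
    import numpy as np

    def shoelace_area(points: np.ndarray) -> float:
        """Area of a closed polygon given ordered (x, y) vertices."""
        x, y = points[:, 0], points[:, 1]
        return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

    def extent_metric(points: np.ndarray) -> float:
        """Polygon area normalized by its axis-aligned bounding-box area."""
        w = points[:, 0].max() - points[:, 0].min()
        h = points[:, 1].max() - points[:, 1].min()
        return shoelace_area(points) / max(w * h, 1e-9)

    def classify_shape(points: np.ndarray) -> str:
        """Assign a shape label from empirically chosen extent regions."""
        e = extent_metric(points)
        if e > 0.85:        # nearly fills its bounding box -> square-like route
            return "square"
        if e > 0.65:        # an inscribed circle has extent pi/4 (~0.785)
            return "circle"
        return "triangle"   # a triangle covers at most half of its bounding box

    def positional_rmse(estimated: np.ndarray, ground_truth: np.ndarray) -> float:
        """Root-mean-square error between estimated and true target coordinates."""
        return float(np.sqrt(np.mean(np.sum((estimated - ground_truth) ** 2, axis=1))))

For instance, a roughly circular route traced by the instructing robot yields an extent near pi/4 and would fall in the "circle" region under these assumed thresholds.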

Keywords

RGB-D; multi-robot; human-robot cooperation; passive action recognition; vision-based gestural interaction

Cited By

No available data.