XBG: End-to-end Imitation Learning for Autonomous Behaviour in Human-Robot Interaction Collaboration
Artificial and Mechanical Intelligence (@artificialandmechanicalint7212) | Uploaded June 2024 | Updated October 2024
This paper presents XBG (eXteroceptive Behaviour Generation), a multimodal end-to-end Imitation Learning (IL) system for a whole-body autonomous humanoid robot used in real-world Human-Robot Interaction (HRI) scenarios. The main contribution of this paper is an architecture for learning HRI behaviors using a data-driven approach. Through teleoperation, a diverse dataset is collected, comprising demonstrations across multiple HRI scenarios, including handshaking, handwaving, payload reception, walking, and walking with a payload. After synchronizing, filtering, and transforming the data, different Deep Neural Network (DNN) models are trained. The final system integrates exteroceptive and proprioceptive sources of information to provide the robot with an understanding of both its environment and its own actions. During interactions, the robot takes in sequences of images (RGB and depth) together with joint-state information and reacts accordingly, demonstrating the learned behaviors. By fusing multimodal signals over time, we encode new autonomous capabilities into the robotic platform, allowing it to understand how the context changes over time. The models are deployed on ergoCub, a real-world humanoid robot, and their performance is measured by the success rate of the robot's behavior in the scenarios above.
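The abstract does not specify the network internals, but the kind of architecture it describes (per-frame RGB and depth encoders, a proprioception encoder, and temporal fusion of the three streams into an action output) can be sketched in a few lines of PyTorch. Everything below is an assumption for illustration: the class name XBGSketch, all layer sizes, the GRU as the temporal model, and the num_joints/action_dim parameters are hypothetical and are not the authors' actual XBG implementation.

```python
import torch
import torch.nn as nn

class XBGSketch(nn.Module):
    """Hypothetical multimodal behaviour-generation network: encodes RGB,
    depth, and joint-state sequences, fuses them over time, and regresses
    the next whole-body action. Not the paper's actual architecture."""

    def __init__(self, num_joints=32, action_dim=32, hidden=256):
        super().__init__()
        # Per-frame visual encoders (RGB: 3 channels, depth: 1 channel).
        self.rgb_enc = nn.Sequential(
            nn.Conv2d(3, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.depth_enc = nn.Sequential(
            nn.Conv2d(1, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Proprioception encoder for the joint-state vector.
        self.joint_enc = nn.Sequential(nn.Linear(num_joints, 64), nn.ReLU())
        # Temporal fusion across the frame sequence (one option; the paper
        # only states that multimodal signals are fused in time).
        self.temporal = nn.GRU(64 + 64 + 64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, action_dim)

    def forward(self, rgb, depth, joints):
        # rgb: (B, T, 3, H, W), depth: (B, T, 1, H, W), joints: (B, T, J)
        B, T = rgb.shape[:2]
        r = self.rgb_enc(rgb.flatten(0, 1)).view(B, T, -1)
        d = self.depth_enc(depth.flatten(0, 1)).view(B, T, -1)
        j = self.joint_enc(joints)
        fused, _ = self.temporal(torch.cat([r, d, j], dim=-1))
        return self.head(fused[:, -1])  # action for the latest timestep
```

In an imitation-learning setup of this shape, such a model would be trained with a regression loss (e.g. MSE) between its output and the teleoperator's recorded commands, then deployed closed-loop on the robot; the concrete training details here are the paper's, not the sketch's.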