KIMLAB (Kinetic Intelligent Machine LAB) | [2023 Weekly KIMLAB] Specifying Target Objects in Teleoperation Using Speech and Natural Eye Gaze | Uploaded December 2023
In this study, we propose a new intent detection framework for teleoperating robotic arms based on human speech and natural eye gaze. Our framework applies instance segmentation to the robot's camera image and predicts the human's intended object by matching eye-gaze data against the instance masks, instance classes, and transcribed words.
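The matching step can be pictured as a scoring loop over segmented instances that combines how many gaze samples fall inside each instance mask with whether the instance's class is mentioned in the transcribed speech. The sketch below is only an illustration with hypothetical names (predict_target, gaze_weight, speech_weight) and a simple weighted sum; it is not the actual model or weighting used in the paper.

import numpy as np

def predict_target(instance_masks, instance_classes, gaze_points, transcript,
                   gaze_weight=0.7, speech_weight=0.3):
    """Score each segmented instance by gaze overlap and speech mention.

    instance_masks  : list of HxW boolean arrays from instance segmentation
    instance_classes: list of class names, one per mask
    gaze_points     : iterable of (row, col) pixel coordinates of gaze samples
    transcript      : transcribed speech as a lowercase string
    """
    gaze_points = list(gaze_points)
    scores = []
    for mask, cls in zip(instance_masks, instance_classes):
        # Fraction of gaze samples that land inside this instance's mask.
        hits = sum(mask[r, c] for r, c in gaze_points
                   if 0 <= r < mask.shape[0] and 0 <= c < mask.shape[1])
        gaze_score = hits / max(len(gaze_points), 1)
        # Simple speech cue: 1 if the spoken words mention this class, else 0.
        speech_score = 1.0 if cls.lower() in transcript else 0.0
        scores.append(gaze_weight * gaze_score + speech_weight * speech_score)
    return int(np.argmax(scores))  # index of the predicted target object

The real framework presumably fuses these cues with a richer association between transcribed words and instance classes, but the weighted-sum structure conveys the basic idea of combining gaze and speech evidence per object.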

This work will be presented at IEEE-RAS Humanoids 2023 in Austin.
2023.ieee-humanoids.org

Yu-Chen (Johnny) Chang, Nitish Gandi, Kazuki Shin, Ye-Ji Mun, Katherine Driggs-Campbell, and Joohyung Kim, "Specifying Target Objects in Robot Teleoperation Using Speech and Natural Eye Gaze," IEEE-RAS International Conference on Humanoid Robots (Humanoids), December 2023.