Ringbot Outdoor Test
KIMLAB (Kinetic Intelligent Machine LAB) | 2024-04-08
We conducted trials of Ringbot outdoors on a 400 m track. With a 2300 mAh, 11.1 V power source, Ringbot covered approximately 3 km in 37 minutes. We commanded its target speed and direction using a remote joystick controller (a Steam Deck), and Ringbot fell five times during this trial.
- Power source: two 850 mAh 11.1 V batteries and two 300 mAh 11.1 V batteries

Who is riding Ringbot? #starwars #robot #uiuc #kimlab (2024-04-14)
Guess who is riding Ringbot?
- Ringbot video: youtu.be/DX3DMEreDyk

MOMO's Bam Yang Gang (밤양갱) (2024-03-31)
MOMO has learned the Bam Yang Gang dance moves with its hand dexterity :) By analyzing 2D dance videos, we extract detailed hand skeleton data, allowing us to recreate the moves in 3D using a hand model. With this information, MOMO replicates the dance motions with its arm and hand joints.
RILAB and KIMLAB have been collaborating on research related to robot motion retargeting, which encompasses hand motions. Please stay tuned for additional research results from both labs.
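The skeleton-to-robot step of such a retargeting pipeline can be sketched in miniature. The snippet below maps one three-point finger chain (MCP, PIP, fingertip) from an estimated 3D hand skeleton to a single servo flexion command; the joint names, servo limits, and linear mapping are illustrative assumptions, not KIMLAB's actual retargeting code.

```python
import math

def interior_angle(a, b, c):
    """Interior angle at vertex b (radians) formed by 3D points a, b, c."""
    u = [a[i] - b[i] for i in range(3)]
    v = [c[i] - b[i] for i in range(3)]
    dot = sum(ui * vi for ui, vi in zip(u, v))
    nu = math.sqrt(sum(ui * ui for ui in u))
    nv = math.sqrt(sum(vi * vi for vi in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def retarget_finger(mcp, pip, tip, servo_min=0.0, servo_max=1.8):
    """Map finger flexion (0 = straight) to a clamped servo command.
    The servo limits here are hypothetical placeholders."""
    flexion = math.pi - interior_angle(mcp, pip, tip)
    return max(servo_min, min(servo_max, flexion))

straight = retarget_finger((0, 0, 0), (1, 0, 0), (2, 0, 0))  # straight chain -> 0 flexion
bent = retarget_finger((0, 0, 0), (1, 0, 0), (1, 1, 0))      # 90-degree bend -> pi/2 flexion
```

Repeating this per joint over the whole hand skeleton, frame by frame, yields a joint-space trajectory the robot can track.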
- Reference motion: youtube.com/shorts/12tJBuOenYs

Low-cost and Easy-to-Build Soft Robotic Skin for Safe and Contact-rich Human-Robot Collaboration (2024-03-25)
Disney's Baymax frequently takes the spotlight in research presentations on soft and safe physical human-robot interaction (pHRI). KIMLAB's recent paper in TRO takes a step toward realizing the Baymax concept by enveloping the skeletons of PAPRAS (Plug And Play Robotic Arm System) with soft skins and using them for sensing.
This video shows the step-by-step process of attaching our low-cost, easy-to-build soft robotic skins to PAPRAS. We then demonstrate physical interaction, such as pushing or tapping, between a human user and the skin-equipped arm, and highlight a safety feature that automatically halts operation if the human arm becomes caught in the motion of the robot arm's elbow joint.
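A safety stop of this kind can be sketched as a simple threshold monitor over the skin's pressure readings; the taxel layout, units, and debounce count below are illustrative assumptions, not the paper's actual controller.

```python
def should_halt(pressure_history, threshold=15.0, consecutive=3):
    """Return True if any taxel exceeded `threshold` (arbitrary units)
    for `consecutive` successive samples, e.g. an arm pinned at the elbow.

    pressure_history: list of per-sample lists, one reading per taxel.
    """
    if len(pressure_history) < consecutive:
        return False
    recent = pressure_history[-consecutive:]
    n_taxels = len(recent[0])
    return any(all(sample[t] > threshold for sample in recent)
               for t in range(n_taxels))

# A single spike on taxel 1 (a light tap) should not trip the stop.
tap = [[0, 2], [0, 18], [0, 3]]
# Sustained pressure on taxel 1 over three samples triggers a halt.
pinch = [[0, 2], [0, 20], [0, 22], [0, 25]]
```

Requiring several consecutive over-threshold samples distinguishes deliberate taps from a limb actually trapped against the joint.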
Toyota Research Institute (TRI) provided funds to support this work as a part of the "Superhuman Multimodal Sensing for Manipulation" project.
Thank you for watching! For detailed technical insights, feel free to access our paper through the following link, as it is an open-access publication.
- Paper: K. Park, K. Shin, S. Yamsani, K. Gim and J. Kim, "Low-cost and Easy-to-Build Soft Robotic Skin for Safe and Contact-rich Human-Robot Collaboration," in IEEE Transactions on Robotics, doi: 10.1109/TRO.2024.3378174 ieeexplore.ieee.org/document/10473193
- PAPRAS (Plug And Play Robotic Arm System) Preprint: arxiv.org/abs/2302.09655 Video: youtu.be/2-63T04-Axk

3D printable Robot Hand and Tactile Sensor based on Air-pressure and Capacitive Proximity Sensing (2024-03-18)
This video presents KIMLAB's new three-fingered robotic hand, featuring soft tactile sensors for enhanced grasping. Built with cost-effective 3D-printing materials, it is robust and efficient to operate. By integrating readily available sensors, it detects interaction forces and proximity to conductive objects. With its 3D-printed components and six servo motors, production is streamlined and inexpensive, coming in at under 370 USD per hand.
Toyota Research Institute (TRI) provided funds to support this work as a part of the "Superhuman Multimodal Sensing for Manipulation" project.
- Paper: Sean Taylor*, Kyungseo Park*, Sankalp Yamsani, and Joohyung Kim, "Fully 3D printable Robot Hand and Soft Tactile Sensor based on Air-pressure and Capacitive Proximity Sensing," IEEE International Conference on Robotics and Automation (ICRA2024), May 2024
ieeexplore.ieee.org/document/10610731

Ringbot: Monocycle Robot with Legs (2024-02-07)
In this video, we present Ringbot, a novel leg-wheel transformer robot incorporating a monocycle mechanism with legs. Ringbot aims to provide versatile mobility by replacing the driver and driving components of a conventional monocycle vehicle with legs mounted on compact driving modules inside the wheel.
Ringbot represents KIMLAB's "ingenuity", demonstrating our ability to bring a sci-fi-like mobile robot to the real world through robotics.
Thank you for watching this video! If you're interested in exploring the technical details further, you can find them in the following link.
- Paper: K. G. Gim and J. Kim, "Ringbot: Monocycle Robot With Legs," in IEEE Transactions on Robotics, doi: 10.1109/TRO.2024.3362326 ieeexplore.ieee.org/document/10423226

[2023 Weekly KIMLAB] MOMO's Tree Topping Travel (2023-12-22)
Embark on MOMO (Mobile Object Manipulation Operator)'s thrilling quest to ignite joy and excitement! Watch as MOMO skillfully places the tree topper, ensuring that every KIMLAB member's holiday season is filled with happiness and brightness. Happy Holidays! 🎄🌟
- MOMO: youtu.be/OvcFSEICq9w?feature=shared

[2023 Weekly KIMLAB] Specifying Target Objects in Teleoperation Using Speech and Natural Eye Gaze (2023-12-13)
In this study, we propose a new intent detection framework for teleoperating robotic arms based on human speech and natural eye gaze. Our framework applies instance segmentation to the robot's camera image and predicts the human's intended object by matching eye-gaze data, instance masks, instance classes, and transcribed words.
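A minimal version of that matching step might score each segmented instance by combining gaze proximity with whether its class name was spoken. The instance fields, Gaussian gaze model, and speech weight below are hypothetical stand-ins, not the paper's model.

```python
import math

def predict_intended_object(gaze_xy, words, instances, sigma=50.0):
    """Pick the instance maximizing speech-match x gaze-proximity.

    instances: list of dicts with hypothetical fields
      'class_name' (str) and 'centroid' ((x, y) pixel center of the mask).
    A Gaussian over gaze-to-centroid distance models eye-gaze evidence;
    a spoken class name doubles the score.
    """
    spoken = {w.lower() for w in words}

    def score(inst):
        dx = inst["centroid"][0] - gaze_xy[0]
        dy = inst["centroid"][1] - gaze_xy[1]
        gaze_term = math.exp(-(dx * dx + dy * dy) / (2 * sigma ** 2))
        speech_term = 2.0 if inst["class_name"].lower() in spoken else 1.0
        return gaze_term * speech_term

    return max(instances, key=score)

objs = [{"class_name": "cup", "centroid": (100, 120)},
        {"class_name": "bottle", "centroid": (300, 140)}]
target = predict_intended_object((110, 125), ["grab", "the", "cup"], objs)
```

When gaze and speech disagree, the multiplicative scoring lets either cue dominate only if the other does not strongly contradict it.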
***This work will be presented at the IEEE-RAS Humanoids 2023 in Austin. 2023.ieee-humanoids.org
Yu-Chen (Johnny) Chang, Nitish Gandi, Kazuki Shin, Ye-Ji Mun, Katherine Driggs-Campbell, Joohyung Kim, "Specifying Target Objects in Robot Teleoperation Using Speech and Natural Eye Gaze," IEEE-RAS International Conference on Humanoid Robots, December 2023

[CWBB 2023] 3min Lightning Talks for Poster Session!! (2023-12-08)
"Can we build Baymax? Part VIII: Let's talk about Safe, Commercially Viable Humanoids" will be held at Humanoids 2023, in Austin, Texas, on 12 Dec. 2023.
This video features ten lightning talks.
0:00 Video Title
0:05 Cornelia Bauer, Dominik Bauer
3:07 Joseph Byrnes
6:09 Grzegorz Ficht
9:11 Yunho Han
12:15 Ernesto Hernandez Hinojosa
15:17 Donghyeon Kim
18:03 Cornelius Klas
21:05 Niranjan Kumar
24:06 Daegyu Lim
27:08 Young-Woo Sim
30:10 Invited Speakers
*List of Speakers:
- 1X Technologies - Bernt Øivind Børnich (CEO and Co-founder)
- Agility Robotics - Jonathan Hurst (Co-Founder and Chief Robot Officer)
- Apptronik - Nick Paine (CTO)
- IHMC - Robert Griffin (Research Scientist, Technical Advisor of Boardwalk Robotics)
- Figure AI - Jerry Pratt (CTO)
- Fourier Intelligence - Zen Koh (Global CEO)
- PAL Robotics - Francesco Ferro (CEO)
- Sanctuary AI - Jeremy Fishel (Principal Researcher)
- Toyota's Frontier Research Center - Taro Takahashi (Project Manager, Technical Adviser of TRI)
- Unitree - Tony Yang (North America Sales Director)
*Organizers:
- Christopher Atkeson (Carnegie Mellon University, USA)
- Katsu Yamane (Bosch Research, USA)
- Joohyung Kim (University of Illinois Urbana-Champaign, USA)
- Jinoh Lee (German Aerospace Center, Germany)
- Alexander Alspach (Toyota Research Institute, USA)

[2023 Weekly KIMLAB] Introduction to Humanoid Robotics Class (2023-11-24)
Since 2020, KIMLAB has worked to build an affordable humanoid robot tailored for education, with essential features such as a ROS-enabled processor and multi-modal sensing. Starting from a commercially available product, we integrated an SBC (Orange Pi Lite 2), a camera, and an IMU to create a cost-effective humanoid robot priced at less than $700 in total. The robot has served as the primary instructional platform in the "Introduction to Humanoid Robotics" course at UIUC, where students showcased remarkable final projects presenting diverse adaptations of this robot.
- Upgrade Kit Assembly Instruction: youtu.be/Su98xT55EOg

[CWBB 2023] Can we build Baymax? Part VIII: Let's talk about Safe, Commercially Viable Humanoids (2023-11-20)
The 8th "Can we build Baymax (CWBB)?" workshop, at the Humanoids 2023 conference, will be held in Austin, Texas, on 12 Dec. 2023.
This year’s CWBB workshop will bring together researchers working on safe, commercially viable humanoid robots. In particular, we will draw attention to efforts towards the safe commercialization of humanoids including application domains, safety considerations for real-world deployment, and design considerations for manufacturing and production at scale.
The workshop is an in-person-only event. Please find more information at baymax.org
*List of Speakers:
- 1X Technologies - Bernt Øivind Børnich (CEO and Co-founder)
- Agility Robotics - Jonathan Hurst (Co-Founder and Chief Robot Officer)
- Apptronik - Nick Paine (CTO)
- IHMC - Robert Griffin (Research Scientist, Technical Advisor of Boardwalk Robotics)
- Figure AI - Jerry Pratt (CTO)
- Fourier Intelligence - Zen Koh (Global CEO)
- PAL Robotics - Francesco Ferro (CEO)
- Sanctuary AI - Jeremy Fishel (Principal Researcher)
- Toyota's Frontier Research Center - Taro Takahashi (Project Manager, Technical Adviser of TRI)
- Unitree - Tony Yang (North America Sales Director)
*Organizers:
- Christopher Atkeson (Carnegie Mellon University, USA)
- Katsu Yamane (Bosch Research, USA)
- Joohyung Kim (University of Illinois Urbana-Champaign, USA)
- Jinoh Lee (German Aerospace Center, Germany)
- Alexander Alspach (Toyota Research Institute, USA)

[2023 Weekly KIMLAB] Exploring the Capabilities of a General-Purpose Robotic Arm in Chess Gameplay (2023-11-17)
As the field of human-robot collaboration grows and autonomous general-purpose service robots become more prevalent, robots need situational awareness and the ability to handle tasks with a limited field of view and workspace. Addressing these challenges, KIMLAB and Prof. Yong Jae Lee at the University of Wisconsin-Madison use the game of chess as a testbed, employing a general-purpose robotic arm.
To achieve successful robotic manipulation in playing board games like chess, the robot must follow game rules and prioritize human safety, all while operating in a limited workspace with field-of-view constraints. A pipeline is proposed that considers the rules of the chess game, the constraints of the robot, and environmental conditions to generate valid moves and countermoves. To validate the algorithm, experiments were conducted with the robotic platform, and the parameters of the system were tuned. The proposed algorithm achieved competitive performance, with an average robot turn success rate of 91.94%. This research demonstrates the potential of a general-purpose robot with a single camera to engage safely in competitive tasks with humans, even with these constraints.
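One piece of such a pipeline, pruning rule-valid moves by the arm's limited workspace, can be sketched as below. The board origin, square size, and reach radius are made-up numbers, not the paper's parameters.

```python
import math

SQUARE = 0.05  # board square size in meters (assumed)

def square_xy(sq, board_origin=(0.30, -0.20)):
    """Center of a square like 'e2' in the robot base frame (assumed layout)."""
    file_idx = ord(sq[0]) - ord('a')
    rank_idx = int(sq[1]) - 1
    return (board_origin[0] + rank_idx * SQUARE,
            board_origin[1] + file_idx * SQUARE)

def reachable_moves(moves, max_reach=0.55):
    """Keep only moves whose source and destination squares both lie
    within the arm's reach radius from the base at (0, 0)."""
    def in_reach(sq):
        x, y = square_xy(sq)
        return math.hypot(x, y) <= max_reach
    return [m for m in moves if in_reach(m[:2]) and in_reach(m[2:])]

# With the assumed geometry, far squares like a7 and h8 fall out of reach.
candidates = ["e2e4", "a7a5", "h1h8"]
playable = reachable_moves(candidates)
```

In a full pipeline this filter would sit after a chess-rules move generator and before motion planning, so the planner only ever sees moves the arm can physically execute.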
***This work will be presented at the IEEE-RAS Humanoids 2023 in Austin. 2023.ieee-humanoids.org
Kazuki Shin, Sankalp Yamsani, Roman Mineyev, Hongyu Chen, Nitish Gandi, Yong Jae Lee, Joohyung Kim, "Exploring the Capabilities of a General-Purpose Robotic Arm in Chess Gameplay," IEEE-RAS International Conference on Humanoid Robots, December 2023

[2023 Weekly KIMLAB] Orthrus: A Dual-arm Quadrupedal Robot (2023-11-10)
This video presents another PAPRAS (Plug-And-Play Robotic Arm System) add-on, engineered to augment the capabilities of the quadrupedal robot Boston Dynamics Spot. The system integrates two PAPRAS units onto Spot, drawing inspiration from Orthrus, the two-headed dog of Greek mythology. Orthrus is compatible with diverse grippers and incorporates our software stack for seamless communication between Spot and PAPRAS. Our demonstration showcases diverse scenarios, encompassing mobile manipulation, entertainment applications, and human interaction.
Sankalp Yamsani, Sean Taylor, Kazuki Shin, Jooyoung Hong, Dhruv Mathur, Kevin Gim, Joohyung Kim, “Orthrus: A Dual-arm Quadrupedal Robot for Mobile Manipulation and Entertainment Applications,” IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), August 2023
Paper link: ieeexplore.ieee.org/document/10309339

[2023 Weekly KIMLAB] MOMO's Jack-O-Lantern (2023-10-31)
At KIMLAB, we have a unique way of carving Halloween pumpkins! Our MOMO (Mobile Object Manipulation Operator) is equipped with PAPRAS arms featuring prosthetic hands, allowing it to use human tools.
If you're interested in joining us for exciting robotics research, explore the graduate school programs at UIUC in the following links.
https://ece.illinois.edu/admissions/graduate

[2023 Weekly KIMLAB] Development of a 3-DOF Interactive Modular Robot with Human-like Head Motions (2023-10-27)
In the realm of robotics, the simplicity of Wall-E's head belies its remarkable capabilities: it mirrors human-like movements and possesses essential perception abilities through an integrated camera. This design stands as the epitome of robotic head engineering. At KIMLAB, we have embraced this iconic design, transforming it into a modular robot head compatible with the PAPRAS docking mount. Using the embedded IMU, our robot maintains its orientation even when the base is in motion. The head module replicates several human head gestures, enriching interaction capabilities, and incorporates facial detection and tracking, ensuring seamless Human-Robot Interaction for various applications.
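The IMU-based stabilization described above amounts to subtracting the measured base rotation from the desired world-frame gaze direction before clamping to joint limits. A minimal sketch, with assumed joint limits and a simplified yaw/pitch-only orientation model:

```python
def stabilized_neck_command(target_world, base_imu, limits=(-1.2, 1.2)):
    """Compute (pan, tilt) neck commands that keep the head pointed at a
    world-frame target while the base moves.

    target_world, base_imu: (yaw, pitch) in radians; limits are assumed.
    The command is the world-frame target minus the base orientation
    reported by the IMU, clamped to the neck joint range.
    """
    lo, hi = limits
    pan = max(lo, min(hi, target_world[0] - base_imu[0]))
    tilt = max(lo, min(hi, target_world[1] - base_imu[1]))
    return pan, tilt

# Base yawed 0.3 rad to the right: the neck pans 0.3 rad left to hold gaze.
cmd = stabilized_neck_command((0.0, 0.1), (0.3, 0.0))
```

For face tracking, the same function can be fed a target that is updated each frame from the detector's output instead of a fixed world direction.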
Chaerim Moon, Sankalp Yamsani, Joohyung Kim, “Development of a 3-DOF Interactive Modular Robot with Human-like Head Motions,” IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), August 2023
Paper link: ieeexplore.ieee.org/document/10309462

[2023 Weekly KIMLAB] What if a Vacuum Robot Has an Arm? (2023-10-19)
Since their mass production in the early 2000s, vacuum robots have emerged as highly successful commercial products in home automation. Many companies and innovators have endeavored to enhance these machines by integrating various capabilities. At KIMLAB, we have implemented a mobile manipulator based on a vacuum robot and an add-on mechanism employing our PAPRAS (Plug-And-Play Robotic Arm System). This video shows our system navigating to specified locations on a map, opening doors, and handling objects along its route, placing them at other positions.
Kevin Genehyub Gim, Sankalp Yamsani, Kazuki Shin, Joohyung Kim, “What if a Vacuum Robot Has an Arm?” International Conference on Ubiquitous Robots, June 2023
Paper: ieeexplore.ieee.org/document/10202493

[2023 Weekly KIMLAB] Demo Expo @IROS2023 (2023-10-12)
As always, performing a live robot demo is no small feat, but KIMLAB members embraced the challenge! MOMO (Mobile Object Manipulation Operator) stole the show at the Demo Expo @IROS2023, charming everyone as it handed out candies and swept the floor with a broom. We also unveiled our armor-controlled robotic backpack. This week's video features the demos KIMLAB presented at IROS 2023.

[2023 Weekly KIMLAB] Lip-Inspired Passive Jamming Gripper with Teeth Structure (2023-10-05)
Imagine dogs effortlessly managing a wide array of items, their versatile mouths allowing them to grasp and hold with ease. Inspired by these natural abilities, we crafted a robotic gripper. Enhancing our passive-jamming lip and incorporating the ingenious structure of a dog's teeth, the gripper now has the dexterity and precision to handle objects in daily life.
Jooyoung Hong, Kazuki Shin, Dhruv Mathur, Sankalp Yamsani, Joohee Yim, Joohyung Kim, "Lip-Inspired Passive Jamming Gripper with Teeth Structure," IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS2023), 2023

[2023 Weekly KIMLAB] MOMO: Mobile Object Manipulation Operator (2023-09-28)
Server robots have become a common sight in restaurants. The next step is equipping them with robotic arms for enhanced mobile manipulation. In this direction, KIMLAB is partnering with HD Hyundai Robotics on MOMO (Mobile Object Manipulation Operator). MOMO features the HD Hyundai Robotics B1 as its mobile base and incorporates the adaptable PAPRAS: there are two mounts on each side of its torso and an extra neck mount, and each side arm gains one more DOF through a prismatic joint. The system's primary goals include autonomously clearing floor obstructions and delivering items to people without human intervention. In MOMO's design, substantial consideration was given to the modularity and adaptability of the system, allowing the mobile manipulator to be optimized for diverse tasks and scenarios.
***KIMLAB will bring MOMO to IROS 2023!!! The Demo Expo will be rocking from 3:30-5:00 pm on Wednesday, October 4, 2023, in the Exhibition Hall. Be sure to track us down and witness the live demo at IROS2023.
Sankalp Yamsani, Kevin Gim, Tyler Smithline, Richard Qiu, Chaerim Moon, Sungmin Kang, Roman Mineyev, Kyungseo Park, Yoon-Koo Kang, Seulbi An, Sunghwan Ahn, Joohyung Kim, "MOMO: Mobile Object Manipulation Operator," Demonstration Sessions, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023), 2023
HD HYUNDAI ROBOTICS S1 (based on B1): service.hyundai-robotics.com/en/product_descriptions/hyundai_s1

[2023 Weekly KIMLAB] Interactive Robotic Backpack with PAPRAS (2023-09-20)
The Weekly KIMLAB's second video unveils PAPRAS:Backpack. While movies and animations often showcase characters and superheroes with additional limbs, bringing such concepts into the real world involves a multitude of considerations. In this video, we delve into the development of a skeleton estimator utilizing a multi-IMU system and an interactive motion controller designed for our robotic backpack.
***KIMLAB is gearing up to unleash the PAPRAS: Backpack at the Demo Expo during IROS 2023!!! The Demo Expo will be rocking from 3:30-5:00 pm on Wednesday, October 4, 2023, in the Exhibition Hall. Be sure to track us down and witness the live demo at IROS2023.
Chaerim Moon, Sean Taylor, Kevin G Gim, Sankalp Yamsani, Kazuki Shin, Kyungseo Park, Joohyung Kim, "Robotic Backpack System with Pluggable Supernumerary Limbs," Demonstration Sessions, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023), 2023
The backpack hardware was created in 2021, and there are additional video clips showcasing this system: youtu.be/byzNdOhueeQ youtu.be/5O2HC9aiIYU youtu.be/uq5AF7WDP1Y

[2023 Weekly KIMLAB] PAPRAS: Plug And Play Robotic Arm System (2023-09-14)
To kick off 2023 Weekly KIMLAB, we introduce PAPRAS, the Plug-And-Play Robotic Arm System. A series of PAPRAS applications will be posted in the coming weeks. If you are interested in the details of PAPRAS, please check our paper at the following link.
- KIMLAB members: Joohyung Kim, Kyungseo Park, Kevin Gim, Jooyoung Hong, Kazuki Shin, Omar Darwish, Sankalp Yamsani, Chaerim Moon, Sean Taylor
- Undergrad student helpers: Will Chen, Min Kyu Kim, Sungmin Kang

[CWBB2022] Build a Punyo Gripper (2022-12-11)
Title: Build a Punyo Gripper
Speaker: Alex Alspach, TRI
Abstract: Toyota Research Institute (TRI) is excited to release the build instructions and design files for our Punyo Bubble Gripper so that our friends and colleagues can test our technology, improve upon it, and take us closer to building robotic assistants for the home!
https://punyo.tech
Want to work with us? We are developing a demonstration kit to share with partners. If you'd like to receive updates, please let us know at https://punyo.tech/survey.
Question or comments? Feel free to message us at punyo-info@tri.global
The Punyo project information is distributed under the Creative Commons Attribution-NonCommercial 4.0 International license.

[CWBB2022] Vision-based Tactile Sensor FingerVision: Open-Source Soft/Hardware and Commercialization (2022-12-11)
Talk title: Vision-based Tactile Sensor FingerVision: Open-Source Soft/Hardware and Commercialization
Speaker: Akihiko Yamaguchi, Tohoku University
Abstract: We are developing FingerVision, a vision-based tactile sensor for robots, including its application to robotic manipulation of various objects, especially deformable ones such as food items. Since 2017, we have open-sourced the technology in order to popularize vision-based tactile sensing. Several subsequent tactile sensors have been built by other researchers, which we see as partly a contribution of the open-source release, but we feel we should accelerate further. In this talk, I will share the experience gained through this activity and present our current efforts.
Speaker Bio: Akihiko Yamaguchi received the BE degree from Kyoto University in 2006, and the ME and PhD degrees from the Nara Institute of Science and Technology (NAIST), Nara, Japan, in 2008 and 2011, respectively. After graduation, he worked at NAIST as an Assistant Professor (2011-2015), at the Robotics Institute of Carnegie Mellon University as a postdoctoral fellow (2014-2017), and at the Graduate School of Information Sciences of Tohoku University as an Assistant Professor (2017-2022). Currently, he is with FingerVision Inc. as CEO and CTO (2021-) and with Tohoku University as a lecturer (2022-).

[CWBB2022] Balancing proprietary IP and open source to build Baymax (2022-12-11)
Title: Balancing proprietary IP and open source to build Baymax
Speaker: Bram Vanderborght, Vrije Universiteit Brussel (VUB) and imec, Belgium
Abstract: To realize the vision of self-healing, sustainable robots, we developed an unprecedented integrated approach consisting of several essential technology breakthroughs along the value chain: a portfolio of sustainable self-healing materials, multi-material manufacturing, and self-healing embedded sensors, yielding the first self-healing grippers and bionic hands. For the first time, complex 3D structures were able to fully recover their functional performance after being completely cut in two. Through everything from patents to open-source tutorials, we aim for better interfacing, to promote uptake and the development of new applications with the community.
Projects: http://sherofet.eu http://smartitn.eu http://project-shinto.eu

[CWBB2022] Delivering robust solutions using Open Source (2022-12-11)
Title: Delivering robust solutions using Open Source
Speaker: Francesco Ferro, PAL Robotics
Abstract: PAL Robotics' CEO, Francesco Ferro, spoke at the Humanoids 2022 workshop "Can We Build Baymax?", explaining the latest hardware developments at PAL Robotics, including advancements in our range of humanoid robots for research, and the open software and tutorials available for our platforms.
pal-robotics.com/research

[CWBB2022] FRIDA: Framework and Robotics Initiative for Developing Arts (2022-12-11)
FRIDA: Framework and Robotics Initiative for Developing Arts

[CWBB2022] NimbRo-OP2(X): RoboCup AdultSize-winning Open-source Humanoid Soccer Robots (2022-12-11)
Title: NimbRo-OP2(X): RoboCup AdultSize-winning Open-source Humanoid Soccer Robots
Speaker: Sven Behnke, University of Bonn
Abstract: For several years, the high development and production costs of humanoid robots restricted researchers interested in working in the field. To address this issue, we developed the NimbRo-OP2 and NimbRo-OP2X robots as capable and affordable adult-sized humanoid platforms, aiming to significantly lower the entry barrier for humanoid robot research. With a height of 135 cm and a weight of only 19 kg, the robots can interact in an unmodified human environment without special safety equipment. The robots are equipped with a powerful on-board computer with a GPU, which enables the implementation of state-of-the-art approaches for visual perception and motion planning. The capabilities of the NimbRo-OP2(X) robots, especially in terms of locomotion stability and visual perception, were evaluated at international RoboCup Soccer competitions, where my team NimbRo won the Humanoid AdultSize class multiple times.
Web page: https://www.ais.uni-bonn.de/nimbro/Humanoid
Speaker Bio: Prof. Dr. Sven Behnke has held the chair for Autonomous Intelligent Systems at the University of Bonn since 2008 and heads Computer Science Institute VI – Intelligent Systems and Robotics. He graduated in 1997 from Martin-Luther-Universität Halle-Wittenberg (Dipl.-Inform.) and received his doctorate in computer science (Dr. rer. nat.) from Freie Universität Berlin in 2002. In his dissertation "Hierarchical Neural Networks for Image Interpretation" he extended feed-forward deep learning models to recurrent models for visual perception. In 2003 he did postdoctoral research on robust speech recognition at the International Computer Science Institute in Berkeley, CA. From 2004 to 2008 Professor Behnke led the Emmy Noether Junior Research Group "Humanoid Robots" at Albert-Ludwigs-Universität Freiburg. His research interests include cognitive robotics, computer vision, and machine learning. Prof. Behnke has received several Best Paper Awards, three Amazon Research Awards (2018-20), a Google Faculty Research Award (2019), and the Ralf-Dahrendorf-Prize of the BMBF for the European Research Area (2019). His team NimbRo has won numerous robot competitions (RoboCup Humanoid Soccer, RoboCup@Home, MBZIRC, ANA Avatar XPRIZE).

[CWBB2022] Toward Open-Sourcing of Large-area, Multimodal Sensing for Humanoid with Soft Skin (2022-12-11)
Title: Toward Open-Sourcing of Large-area, Multimodal Sensing for Humanoid with Soft Skin
Speaker: Van A. Ho, JAIST
Abstract: Humans possess skin, the largest soft organ, which helps assess the surroundings through physical interactions. Therefore, to build a Baymax-like humanoid robot that is fully aware of social and task-oriented interaction, it is important to equip the robot's whole body with multimodal sensing skin. In this talk, I will introduce our newest development of soft skin with vision-based multimodal sensing, including tactile and proximity, and its potential application in building humanoid robots. I will also describe our attempt to open-source the developed skin system, both software and hardware, so that it can be used widely, both in humanoid robot development and in education on soft sensing techniques.
Speaker Bio: Van A. Ho received the Ph.D. degree in robotics from Ritsumeikan University, Kyoto, Japan, in 2012. In 2017, he joined the Japan Advanced Institute of Science and Technology (JAIST), where he set up a laboratory on soft robotics. His current research interests are soft robotics, soft haptic interaction, tactile sensing, grasping and manipulation, and bio-inspired robots. He was a recipient of the prestigious Japan Society for the Promotion of Science (JSPS) Research Fellowship for Young Scientists. Ho received the 2019 IEEE Nagoya Chapter Young Researcher Award and was a Best Paper Finalist at IEEE SII (2016) and IEEE RoboSoft (2020). He is a member of the Robotics Society of Japan (RSJ) and a Senior Member of the IEEE. He serves as Associate Editor for international conferences such as IROS, SII, and RoboSoft, as well as for journals including IEEE Transactions on Robotics (T-RO), IEEE Robotics and Automation Letters (RA-L), and Advanced Robotics. He is General Co-Chair of the 2023 IEEE/SICE International Symposium on System Integration (SII).

[CWBB2022] Open source and motion generation for humanoids: feedback from the Gepetto's team (2022-12-11)
Title: Open source and motion generation for humanoids: feedback from the Gepetto's team
Speaker: Olivier Stasse, Laboratory for Analysis and Architecture of Systems (LAAS-CNRS)

[CWBB2022] Reachy: open source interactive humanoid platform to explore real world applications (2022-12-11)
Title: Reachy: open source interactive humanoid platform to explore real world applications
Speaker: Matthieu Lapeyre, Pollen Robotics
Abstract: In this talk, we will give a complete overview of the Reachy robot, an interactive robot made to explore real-world applications. We will discuss why we started to work on it, what it is now, and where we are going.
Speaker Bio: Matthieu Lapeyre is a former researcher in humanoid robotics at INRIA Flowers (France). From 2012 to 2016, he was author and project leader of Poppy, an open-source 3D-printed robot, and Pypot, an open-source Python library for robot control. He co-founded Pollen Robotics in 2016 to develop technologies for real-world robotic manipulation and HRI. He designed Reachy, an open-source interactive robot, and has built a world-class robotics team (15 people). In 2022, the Pollen team reached 2nd place at the Avatar XPRIZE competition with the Reachy 2.

[CWBB2022] Humanoids: Education and Outreach // Education and Open Source for Humanoid Robots (2022-12-11)
Title: Humanoids: Education and Outreach // Education and Open Source for Humanoid Robots
Speaker: Chris Atkeson and Conrad Tucker (CMU)

Babyface Snapbot on the run (2022-11-14)
Babyface Snapbot is being chased by Orthrus (double-headed Spot).
The main character in this movie, Snapbot V2 with Babyface, is now fully open-source. You can find all the information, including head designs, in our Snapbot repository.
- Music
1. Mutants (from "Toy Story")
2. Calling All The Monsters (from "A.N.T. Farm")
3. La Tentation De Simba (from "The Lion King")
4. Doc Ock Suite (from "Spider-Man 2")
5. Sid (from "Toy Story")

[KIMLAB x PSYONIC] PAPRAS:Orthrus playing Captain America (2022-08-26)
...

[KIMLAB x PSYONIC] Turned to the Dark Side (Star Wars General Grievous theme) (2022-08-12)
Hello There, KIMLAB's PAPRAS:Orthrus sharpened its Star Wars lightsaber skills thanks to PSYONIC's Ability Hands!!!
This paper presents our implementation of a large-scale biped robot utilizing Hybrid Leg, a 6-DoF serial-parallel mechanism with a lightweight structure, high payload, and large workspace. We set our design goal to make a biped robot taller than average human height. By applying the hybrid mechanism and design optimization, the robot was built with a height of 1.84 m and a weight of 29.05 kg. The implemented robot can be actuated by the same servo motors used in smaller humanoid robots. The mechanical design of the robot is explained in detail, and kinematics analysis is conducted for analytical solutions. Through multi-body dynamics simulations, the proposed robot design and its performance are verified. In addition, preliminary performance evaluations of the robot hardware are conducted through squat and in-place walking experiments.

Lip-Inspired Passive Jamming Gripper (2022-03-26)
Lip-Inspired Passive Jamming Gripper
By Jooyoung Hong, Dhruv C Mathur, Joohyung Kim
Soft robotic grippers have an advantage over conventional rigid grippers in grasping objects of various shapes, and they have been researched for holding fragile objects. Recently, researchers have exploited the jamming effect of granular particles, which varies stiffness so that objects can be held tightly. In this paper, we propose a lip-inspired soft robotic gripper motivated by animals' oral structure, especially the lips. Lips serve various functions: holding, re-grasping, sucking in, and spitting out objects. This gripper focuses on the functions of holding and re-grasping. The lip pouch is fabricated with granular particles inside for passive particle jamming. This paper describes how the gripper is motivated by the lips, with explanations of the lip's anatomy and functions, followed by an explanation of the passive particle jamming effect. We validated the capability of the gripper's lip pouch with various objects through experiments, and we demonstrated re-grasping objects with this gripper.
PAPRAS:Backpack (Tribute to Dr. Octopus in Spider-Man)KIMLAB (Kinetic Intelligent Machine LAB)2021-12-23 | PAPRAS:Backpack is a standalone backpack including a battery, a computer, and other components.
Holiday Greetings from KIMLABKIMLAB (Kinetic Intelligent Machine LAB)2021-12-21 | Holiday Greetings from KIMLAB
PAPRAS:Cage DemoKIMLAB (Kinetic Intelligent Machine LAB)2021-11-25 | ...
Halloween 2021: Baymax Chasing SNAPBOT V2KIMLAB (Kinetic Intelligent Machine LAB)2021-10-30 | Halloween 2021: Baymax Chasing SNAPBOT V2
Bat Bot 2.0: Bio-inspired Anisotropic Skin, Passive Wrist Joints, and Redesigned Flapping MechanismKIMLAB (Kinetic Intelligent Machine LAB)2021-10-11 | Bat Bot 2.0: Bio-inspired Anisotropic Skin, Passive Wrist Joints, and Redesigned Flapping Mechanism
by Jonathan Hoff, Nicole Jeon, Patrick Li, Joohyung Kim
Bat flight has been an underdeveloped area of bioinspired robotics because of the vast complexities of biological bat flight and the over 40 degrees of freedom present in bats' bodies. The robotic flapping system Bat Bot (B2) has been shown to exhibit fundamental properties of biological bat flight with its articulated wings, its deformable membrane, and its controllable hindlimbs. However, the system's performance is limited by its relatively large mass for the thrust it produces. In an effort to further pursue this important area of flapping flight, we have made several important hardware improvements to the system based on biological inspiration. These include passive wrist joints to reduce negative lift in the upstroke and a novel elastic fiber membrane to mimic the anisotropic nature of bat skin for performance and durability. The redesigned flapping mechanism and structure have reduced the weight by 22%, increased the flapping amplitude, lowered mechanical slackness, and improved mass distribution. These hardware improvements function together in free-flight tests. The new system, Bat Bot 2.0 (B2.0), provides insights into the important elements of bat robot design, and it brings the goal of complex bat flight maneuvers closer to reality.
Self-Supervised Motion Retargeting with Safety GuaranteeKIMLAB (Kinetic Intelligent Machine LAB)2021-03-24 | Self-Supervised Motion Retargeting with Safety Guarantee
by Sungjoon Choi (Korea University), Min Jae Song (NYU), Hyemin Ahn (TUM), Joohyung Kim (UIUC)
In this paper, we present self-supervised shared latent embedding (S3LE), a data-driven motion retargeting method that enables the generation of natural motions in humanoid robots from motion capture data or RGB videos. While it requires paired data consisting of human poses and their corresponding robot configurations, it significantly alleviates the need for time-consuming data collection via novel paired-data generation processes. Our self-supervised learning procedure consists of two steps: automatically generating paired data to bootstrap the motion retargeting, and learning a projection-invariant mapping to handle the different expressivity of humans and humanoid robots. Furthermore, our method guarantees that the generated robot pose is collision-free and satisfies position limits by utilizing nonparametric regression in the shared latent space. We demonstrate that our method can generate expressive robotic motions from both the CMU motion capture database and YouTube videos.
Two-Stage Trajectory Optimization for Flapping Flight with Data-Driven ModelsKIMLAB (Kinetic Intelligent Machine LAB)2021-03-23 | Two-Stage Trajectory Optimization for Flapping Flight with Data-Driven Models
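The safety guarantee from nonparametric regression can be illustrated with a minimal sketch (hypothetical data and function names, not the paper's implementation): Nadaraya-Watson kernel regression predicts the robot configuration as a convex combination of known collision-free configurations, so the output necessarily stays inside their convex hull and hence within convex position limits.

```python
import numpy as np

def kernel_regress(z_query, Z_safe, Q_safe, bandwidth=0.5):
    """Nadaraya-Watson regression in a shared latent space (illustrative).

    Z_safe: (N, d) latent codes of known-safe examples.
    Q_safe: (N, k) corresponding collision-free joint configurations.
    The output is a convex combination of the rows of Q_safe, so it stays
    inside their convex hull (and thus inside convex joint limits).
    """
    d2 = np.sum((Z_safe - z_query) ** 2, axis=1)   # squared latent distances
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))       # Gaussian kernel weights
    w /= w.sum()                                   # normalize: weights sum to 1
    return w @ Q_safe

# Toy example: 2-D latent space, 3-DoF "robot" with poses in [-1, 1].
rng = np.random.default_rng(0)
Z = rng.uniform(-1, 1, size=(50, 2))
Q = rng.uniform(-1, 1, size=(50, 3))
q = kernel_regress(np.array([0.2, -0.1]), Z, Q)
# Convexity keeps the prediction within the per-joint range of safe poses.
assert np.all(q >= Q.min(axis=0)) and np.all(q <= Q.max(axis=0))
```

The design point is that, unlike a decoder network whose output is unconstrained, a convex-combination predictor cannot extrapolate outside the set of verified-safe configurations.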
by Jonathan Hoff and Joohyung Kim
Underactuated robots often require involved routines for trajectory planning due to their complex dynamics. Flapping-wing aerial vehicles have unsteady aerodynamics and periodic gaits that complicate the planning procedure. In this paper, we improve upon existing methods for flight planning by introducing a two-stage optimization routine for planning flapping flight trajectories. The first stage solves a trajectory optimization problem with a data-driven fixed-wing approximation model trained on experimental flight data. Its solution is then used as the initial guess for a second-stage optimization with a flapping-wing model trained on the same flight data. We demonstrate the effectiveness of this approach with a bat robot in both simulation and experimental flights. The speed of convergence, the dependency on the initial guess, and the quality of the solution are improved, and the robot is able to track the optimized trajectory of a dive maneuver.
Demonstration of Snapbot V2KIMLAB (Kinetic Intelligent Machine LAB)2021-02-25 | This is a video for the 16th CSL Student Conference 2021.
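The two-stage idea can be sketched with generic warm-started optimization. The two cost functions below are toy stand-ins (not the paper's data-driven models): a smooth surrogate playing the role of the fixed-wing approximation, and a rougher objective playing the role of the flapping-wing model.

```python
import numpy as np
from scipy.optimize import minimize

def fixed_wing_cost(u):
    """Smooth surrogate objective (stage 1), minimized at u = 1."""
    return np.sum((u - 1.0) ** 2)

def flapping_cost(u):
    """Rougher objective (stage 2): same basin, plus oscillatory ripple
    that creates local minima and makes a cold start unreliable."""
    return np.sum((u - 1.0) ** 2) + 0.1 * np.sum(np.sin(5.0 * u) ** 2)

u0 = np.zeros(10)                           # naive initial guess
stage1 = minimize(fixed_wing_cost, u0)      # cheap surrogate solve
stage2 = minimize(flapping_cost, stage1.x)  # warm-started full solve
```

Warm-starting places the second solve inside the correct basin of the harder objective, which is the mechanism behind the improved convergence speed and reduced initial-guess sensitivity reported above.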
Robotics Session: Thursday, February 25, 12:00-14:00
Assembly Instruction for KIMLAB's ROBOTIS MINI Upgrade KitKIMLAB (Kinetic Intelligent Machine LAB)2020-11-11 | Assembly Instruction for KIMLAB's ROBOTIS MINI Upgrade Kit
Snapbot V2: a Reconfigurable Legged Robot with a Camera for Self Configuration RecognitionKIMLAB (Kinetic Intelligent Machine LAB)2020-10-30 | We present a reconfigurable modular legged robot, Snapbot V2. The mechanical design of Snapbot V2 is enhanced for better dynamic performance and a more robust connection with the modular legs. A motion generator is developed to achieve various locomotion skills in one- to six-leg configurations. The locomotion is tested on a multi-body dynamics simulation model and implemented on the physical robot as well. Visual detection is implemented with a camera module to recognize the robot's configuration: by detecting the particular color of the parts on each leg module, the robot can recognize the number and locations of the connected legs. Based on the recognized configuration, Snapbot V2 automatically selects the proper locomotion style.
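The color-based configuration recognition can be sketched as follows. This is a minimal illustration, assuming one distinct marker color per leg slot; the slot colors, thresholds, and image source here are hypothetical, not Snapbot V2's actual pipeline.

```python
import numpy as np

# Hypothetical marker colors (RGB) for six leg slots.
SLOT_COLORS = {
    0: (255, 0, 0), 1: (0, 255, 0), 2: (0, 0, 255),
    3: (255, 255, 0), 4: (255, 0, 255), 5: (0, 255, 255),
}

def detect_legs(image, tol=30, min_pixels=50):
    """Return the set of slot indices whose marker color is visible.

    image: (H, W, 3) uint8 RGB array. A slot counts as occupied when at
    least `min_pixels` pixels lie within `tol` of its marker color.
    """
    occupied = set()
    img = image.astype(np.int16)  # avoid uint8 wraparound in subtraction
    for slot, color in SLOT_COLORS.items():
        close = np.all(np.abs(img - np.array(color)) <= tol, axis=-1)
        if close.sum() >= min_pixels:
            occupied.add(slot)
    return occupied

# Synthetic frame: gray background with red and blue patches (slots 0 and 2).
frame = np.full((100, 100, 3), 128, dtype=np.uint8)
frame[10:30, 10:30] = (255, 0, 0)
frame[60:80, 60:80] = (0, 0, 255)
assert detect_legs(frame) == {0, 2}
```

The recognized set of occupied slots can then index a lookup table of gaits, matching the automatic selection of locomotion style described above.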