TY - GEN
T1 - Mixed Reality Interface for Whole-Body Balancing and Manipulation of Humanoid Robot
AU - Song, Hyunjong
AU - Bronfman, Gabriel
AU - Zhang, Yunxiang
AU - Sun, Qi
AU - Kim, Joo H.
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - The complexity of control and operation is one of the roadblocks to the widespread utilization of humanoid robots. In this study, we introduce a novel approach to humanoid robot control by leveraging a mixed reality (MR) interface for whole-body balancing and manipulation. This interface system uses an MR headset to track the operator's movements and provide the operator with useful visual information for control. The robot mimics the operator's movements through a motion retargeting method based on linear scaling and inverse kinematics. The operator obtains visual access to the robot's perspective view augmented with fiducial detection and perceives the current stability of the robot by evaluating the robot's center-of-mass state in real time against the precomputed balanced state basin. In experimental demonstrations, the operator successfully controlled the robot to grasp and lift an object without falling. Common issues in teleoperation with virtual reality headsets, namely motion sickness and unawareness of one's surroundings, are mitigated by using an MR headset with transparent glasses. This study demonstrates the potential of MR in teleoperation with a motion retargeting and stability monitoring method.
AB - The complexity of control and operation is one of the roadblocks to the widespread utilization of humanoid robots. In this study, we introduce a novel approach to humanoid robot control by leveraging a mixed reality (MR) interface for whole-body balancing and manipulation. This interface system uses an MR headset to track the operator's movements and provide the operator with useful visual information for control. The robot mimics the operator's movements through a motion retargeting method based on linear scaling and inverse kinematics. The operator obtains visual access to the robot's perspective view augmented with fiducial detection and perceives the current stability of the robot by evaluating the robot's center-of-mass state in real time against the precomputed balanced state basin. In experimental demonstrations, the operator successfully controlled the robot to grasp and lift an object without falling. Common issues in teleoperation with virtual reality headsets, namely motion sickness and unawareness of one's surroundings, are mitigated by using an MR headset with transparent glasses. This study demonstrates the potential of MR in teleoperation with a motion retargeting and stability monitoring method.
UR - http://www.scopus.com/inward/record.url?scp=85200704389&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85200704389&partnerID=8YFLogxK
U2 - 10.1109/UR61395.2024.10597520
DO - 10.1109/UR61395.2024.10597520
M3 - Conference contribution
AN - SCOPUS:85200704389
T3 - 2024 21st International Conference on Ubiquitous Robots, UR 2024
SP - 642
EP - 647
BT - 2024 21st International Conference on Ubiquitous Robots, UR 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 21st International Conference on Ubiquitous Robots, UR 2024
Y2 - 24 June 2024 through 27 June 2024
ER -