TY - GEN
T1 - Augmented Reality as a Medium for Human-Robot Collaborative Tasks
AU - Chacko, Sonia Mary
AU - Kapila, Vikram
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/10
Y1 - 2019/10
N2 - This paper presents a novel augmented reality (AR) interaction method that allows a robot to manipulate unknown physical objects in a human-robot collaborative working environment. A mobile AR application is developed to determine and communicate, in real time, the position, orientation, and dimensions of an arbitrary object in a robot manipulator's workspace for pick-and-place operations. The proposed method estimates the pose and size of the object by means of an AR virtual element superimposed on the live view of the real object. In particular, a semi-transparent AR element is created and manipulated through touchscreen interactions to match the pose and scale of the physical object, thereby providing information about that object. The resulting data are communicated to the robot manipulator to perform pick-and-place tasks. In this way, the AR virtual element acts as a medium of communication between the human and the robot. The performance of the proposed AR interface is assessed through multiple trials with random objects, and the robot is observed to successfully accomplish the tasks communicated through the AR virtual elements. The interface is also tested with 20 users to evaluate the quality of the user experience, followed by a post-study survey. The participants reported that the AR interface is intuitive and easy to operate for manipulating physical objects of various sizes and shapes.
UR - http://www.scopus.com/inward/record.url?scp=85078820570&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85078820570&partnerID=8YFLogxK
DO - 10.1109/RO-MAN46459.2019.8956466
M3 - Conference contribution
T3 - 2019 28th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2019
BT - 2019 28th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 28th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2019
Y2 - 14 October 2019 through 18 October 2019
ER -