TY - GEN
T1 - An Augmented Reality Interface for Human-Robot Interaction in Unconstrained Environments
AU - Chacko, Sonia Mary
AU - Kapila, Vikram
N1 - Funding Information:
Work supported in part by the National Science Foundation under DRK-12 grant DRL-1417769, ITEST grant DRL-1614085, and RET Site grant EEC-1542286, and by NY Space Grant Consortium grant 48240-7887.
Publisher Copyright:
© 2019 IEEE.
PY - 2019/11
Y1 - 2019/11
N2 - As robots start to become ubiquitous in the personal workspace, it is necessary to have simple and intuitive interfaces to interact with them. In this paper, we propose an augmented reality (AR) interface for human-robot interaction (HRI) in a shared working environment. By fusing marker-based and markerless AR technologies, a mobile AR interface is created that enables a smartphone to detect planar surfaces and localize a manipulator robot in its working environment while obviating the need for a controlled or constrained environment. The AR interface and robot manipulator are integrated to render a system that enables users to perform pick-and-place tasks effortlessly. Specifically, a smartphone-based AR application is developed that allows a user to select any location within the robot's workspace by merely touching the smartphone screen. Virtual objects, rendered at user-selected locations, are used to determine the pick and place locations of objects in the real world. The virtual object's start and end points, originally specified in the smartphone camera coordinate frame, are transformed into the robot coordinate frame for the robot manipulator to autonomously perform the assigned task. A user study is conducted with participants to evaluate the system performance and user experience. The results show that the proposed AR interface is user-friendly and intuitive for operating the robot, and that it allows users to easily communicate their intentions through the virtual object.
AB - As robots start to become ubiquitous in the personal workspace, it is necessary to have simple and intuitive interfaces to interact with them. In this paper, we propose an augmented reality (AR) interface for human-robot interaction (HRI) in a shared working environment. By fusing marker-based and markerless AR technologies, a mobile AR interface is created that enables a smartphone to detect planar surfaces and localize a manipulator robot in its working environment while obviating the need for a controlled or constrained environment. The AR interface and robot manipulator are integrated to render a system that enables users to perform pick-and-place tasks effortlessly. Specifically, a smartphone-based AR application is developed that allows a user to select any location within the robot's workspace by merely touching the smartphone screen. Virtual objects, rendered at user-selected locations, are used to determine the pick and place locations of objects in the real world. The virtual object's start and end points, originally specified in the smartphone camera coordinate frame, are transformed into the robot coordinate frame for the robot manipulator to autonomously perform the assigned task. A user study is conducted with participants to evaluate the system performance and user experience. The results show that the proposed AR interface is user-friendly and intuitive for operating the robot, and that it allows users to easily communicate their intentions through the virtual object.
UR - http://www.scopus.com/inward/record.url?scp=85080440763&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85080440763&partnerID=8YFLogxK
U2 - 10.1109/IROS40897.2019.8967973
DO - 10.1109/IROS40897.2019.8967973
M3 - Conference contribution
AN - SCOPUS:85080440763
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 3222
EP - 3228
BT - 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2019
Y2 - 3 November 2019 through 8 November 2019
ER -