TY - CPAPER
T1 - An augmented reality spatial referencing system for mobile robots
AU - Chacko, Sonia Mary
AU - Granado, Armando
AU - Rajkumar, Ashwin
AU - Kapila, Vikram
N1 - Funding Information:
Work supported in part by the National Science Foundation under DRK-12 grant DRL-1417769, ITEST grant DRL-1614085, and RET Site grant EEC-1542286, and by NY Space Grant Consortium grant 76156-10488.
Publisher Copyright:
© 2020 IEEE.
PY - 2020/10/24
Y1 - 2020/10/24
AB - Deploying a mobile service robot in domestic settings is challenging due to the dynamic and unstructured nature of such environments. Successful operation of the robot requires continuous human supervision to keep its spatial knowledge of the changing environment up to date. Thus, it is essential to develop a human-robot interaction (HRI) strategy that enables novice end users to effortlessly provide task-specific spatial information to the robot. Although several approaches have been developed for this purpose, most are not feasible or convenient for use in domestic environments. In response, we have developed an augmented reality (AR) spatial referencing system (SRS) that allows a non-expert user to tag specific locations on a physical surface and thereby assign tasks for the robot to perform at those locations. Specifically, in the AR-SRS, the user provides a spatial reference by creating an AR virtual object with a semantic label. The real-world location of the user-created virtual object is estimated and stored as spatial data along with the user-specified semantic label. We present three approaches to establish the correspondence between the user-created virtual object locations and real-world coordinates on an a priori static map of the service area available to the robot, and we evaluate and report the performance of each approach. We also present use-case scenarios to demonstrate potential applications of the AR-SRS for mobile service robots.
UR - http://www.scopus.com/inward/record.url?scp=85102397396&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85102397396&partnerID=8YFLogxK
DO - 10.1109/IROS45743.2020.9340742
M3 - Conference contribution
AN - SCOPUS:85102397396
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 4446
EP - 4452
BT - 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2020
Y2 - 24 October 2020 through 24 January 2021
ER -