Realizing mixed-reality environments with tablets for intuitive human-robot collaboration for object manipulation tasks

Jared A. Frank, Matthew Moorhead, Vikram Kapila

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Although gesture-based input and augmented reality (AR) facilitate intuitive human-robot interaction (HRI), prior implementations have relied on research-grade hardware and software. This paper explores using tablets to render mixed-reality visual environments that support human-robot collaboration for object manipulation. A mobile interface is created on a tablet by integrating real-time vision, 3D graphics, touchscreen interaction, and wireless communication. This mobile interface augments a live video of physical objects in a robot's workspace with corresponding virtual objects that can be manipulated by a user to intuitively command the robot to manipulate the physical objects. By generating the mixed-reality environment on an exocentric view provided by the tablet camera, the interface establishes a common frame of reference for the user and the robot to effectively communicate spatial information for object manipulation. After addressing challenges due to limitations in mobile sensing and computation, the interface is evaluated in a study with participants to examine the performance and user experience of the proposed approach.
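The common frame of reference described in the abstract implies mapping touchscreen input in the tablet's camera image to coordinates in the robot's workspace. One standard way to do this for a planar tabletop, sketched below purely as an illustration (the paper's actual implementation is not given here), is a homography estimated from a few known image-to-workspace point correspondences; all function names and values are hypothetical.

```python
import numpy as np

def compute_homography(img_pts, world_pts):
    """Estimate the 3x3 homography H mapping image pixels to tabletop
    coordinates from >= 4 point correspondences, via the Direct Linear
    Transform (DLT) solved by SVD."""
    A = []
    for (u, v), (x, y) in zip(img_pts, world_pts):
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    # The solution is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def tap_to_workspace(H, u, v):
    """Project a touchscreen tap at pixel (u, v) onto the tabletop plane."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical calibration: image corners of a 640x480 view mapped to a
# 0.6 m x 0.45 m tabletop region in the robot's frame.
H = compute_homography(
    [(0, 0), (640, 0), (640, 480), (0, 480)],
    [(0.0, 0.0), (0.6, 0.0), (0.6, 0.45), (0.0, 0.45)],
)
x, y = tap_to_workspace(H, 320, 240)  # tap at the image center
```

A tap at the center of the image then lands at the center of the calibrated tabletop region, giving the robot a target position in its own frame.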

Original language: English (US)
Title of host publication: 25th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 302-307
Number of pages: 6
ISBN (Electronic): 9781509039296
State: Published - Nov 15 2016
Event: 25th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN 2016 - New York, United States
Duration: Aug 26 2016 - Aug 31 2016


ASJC Scopus subject areas

  • Artificial Intelligence
  • Social Psychology
  • Human-Computer Interaction
