TY - GEN
T1 - Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet
AU - Frank, Jared A.
AU - Kapila, Vikram
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/3/24
Y1 - 2016/3/24
AB - The integration of augmented reality (AR) techniques in user interface design has enhanced interactive experiences in the teleoperation of robots, hands-on learning in classroom, laboratory, and special education settings, and user training in an array of fields, e.g., aerospace, automotive, construction, manufacturing, and medicine. However, AR-based user interfaces that command machines and tools have not been fully explored for their potential to enhance interactive learning of engineering concepts in the laboratory. This paper outlines the development of a mobile application, executing on a tablet device, that renders an immersive AR-based graphical user interface enabling users to monitor, interact with, and control a four-link underactuated planar robot. Computer vision routines extract real-time, vision-based measurements of the robot's joint angles and end effector location from the live video captured by the tablet's rear-facing camera. The obtained measurements are used to render AR content that provides users with additional visual feedback. Touch gesture recognition is implemented to allow users to naturally and intuitively command the robot by tapping and dragging their fingers at desired locations on the tablet screen. Experimental results demonstrate the performance and efficacy of the proposed system as it is operated in two different modes: one in which the user directly controls the angles of the robot's actuated links, and one in which the user directly controls the end effector location.
UR - http://www.scopus.com/inward/record.url?scp=84965167200&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84965167200&partnerID=8YFLogxK
DO - 10.1109/INDIANCC.2016.7441163
M3 - Conference contribution
AN - SCOPUS:84965167200
T3 - 2016 Indian Control Conference, ICC 2016 - Proceedings
SP - 385
EP - 392
BT - 2016 Indian Control Conference, ICC 2016 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2nd Indian Control Conference, ICC 2016
Y2 - 4 January 2016 through 6 January 2016
ER -