TY - JOUR
T1 - Force-Aware Interface via Electromyography for Natural VR/AR Interaction
AU - Zhang, Yunxiang
AU - Liang, Benjamin
AU - Chen, Boyuan
AU - Torrens, Paul M.
AU - Atashzar, S. Farokh
AU - Lin, Dahua
AU - Sun, Qi
N1 - Publisher Copyright:
© 2022 ACM.
PY - 2022/11/30
Y1 - 2022/11/30
AB - While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR.
KW - electromyography
KW - force-aware neural interface
KW - haptic perception
KW - machine learning
UR - http://www.scopus.com/inward/record.url?scp=85144255716&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85144255716&partnerID=8YFLogxK
U2 - 10.1145/3550454.3555461
DO - 10.1145/3550454.3555461
M3 - Article
AN - SCOPUS:85144255716
SN - 0730-0301
VL - 41
JO - ACM Transactions on Graphics
JF - ACM Transactions on Graphics
IS - 6
M1 - 268
ER -