TY - GEN
T1 - May the Force Be with You
T2 - 23rd IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2024
AU - Zhang, Fengze
AU - Zhang, Yunxiang
AU - Peng, Xi
AU - Achitoff, Sky
AU - Torrens, Paul M.
AU - Sun, Qi
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Advances in virtual reality (VR) have reduced experience differentials for users. However, gaps between reality and virtuality persist in tasks that require coupling users' multimodal physical skills with virtual environments in delicate ways. User embodiment in VR easily breaks when physicality feels inauthentic, especially when users invoke their innate predilection to touch and manipulate things that they encounter. In this research, we examine the potential of force-aware VR interfaces for enabling natural connections to user physicality and evaluate them in high-finesse cases of touch. Combining surface electromyography (sEMG) with visual tracking, we develop an end-to-end learning-based system, ForceSense, to decode users' dexterous finger forces from their forearm sEMG signals for direct usage in standard VR pipelines. This approach eliminates the need for hand-held tactile equipment, thereby promoting natural embodiment. A series of user studies on manipulation tasks in VR validate that ForceSense is more accurate, robust, and intuitive than alternative solutions. Two proof-of-concept VR applications, calligraphy and piano playing, demonstrate that the synergy between visual, auditory, and tactile modalities that ForceSense affords has the potential to enhance users' task learning performance in VR. Our source code and trained models are released at https://github.com/NYU-ICL/vr-force-aware-multimodal-interface.
AB - Advances in virtual reality (VR) have reduced experience differentials for users. However, gaps between reality and virtuality persist in tasks that require coupling users' multimodal physical skills with virtual environments in delicate ways. User embodiment in VR easily breaks when physicality feels inauthentic, especially when users invoke their innate predilection to touch and manipulate things that they encounter. In this research, we examine the potential of force-aware VR interfaces for enabling natural connections to user physicality and evaluate them in high-finesse cases of touch. Combining surface electromyography (sEMG) with visual tracking, we develop an end-to-end learning-based system, ForceSense, to decode users' dexterous finger forces from their forearm sEMG signals for direct usage in standard VR pipelines. This approach eliminates the need for hand-held tactile equipment, thereby promoting natural embodiment. A series of user studies on manipulation tasks in VR validate that ForceSense is more accurate, robust, and intuitive than alternative solutions. Two proof-of-concept VR applications, calligraphy and piano playing, demonstrate that the synergy between visual, auditory, and tactile modalities that ForceSense affords has the potential to enhance users' task learning performance in VR. Our source code and trained models are released at https://github.com/NYU-ICL/vr-force-aware-multimodal-interface.
KW - Force sensing
KW - sEMG learning
KW - tactile perception
UR - http://www.scopus.com/inward/record.url?scp=85211305009&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85211305009&partnerID=8YFLogxK
U2 - 10.1109/ISMAR62088.2024.00060
DO - 10.1109/ISMAR62088.2024.00060
M3 - Conference contribution
AN - SCOPUS:85211305009
T3 - Proceedings - 2024 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2024
SP - 455
EP - 464
BT - Proceedings - 2024 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2024
A2 - Eck, Ulrich
A2 - Sra, Misha
A2 - Stefanucci, Jeanine
A2 - Sugimoto, Maki
A2 - Tatzgern, Markus
A2 - Williams, Ian
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 21 October 2024 through 25 October 2024
ER -