TY - JOUR
T1 - HuBar
T2 - A Visual Analytics Tool to Explore Human Behavior Based on fNIRS in AR Guidance Systems
AU - Castelo, Sonia
AU - Rulff, Joao
AU - Solunke, Parikshit
AU - McGowan, Erin
AU - Wu, Guande
AU - Roman, Iran
AU - Lopez, Roque
AU - Steers, Bea
AU - Sun, Qi
AU - Bello, Juan
AU - Feest, Bradley
AU - Middleton, Michael
AU - McKendrick, Ryan
AU - Silva, Claudio
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
AB - The concept of an intelligent augmented reality (AR) assistant has significant, wide-ranging applications, with potential uses in domains such as medicine, the military, and mechanics. Such an assistant must be able to perceive the environment and actions, reason about the environment state in relation to a given task, and seamlessly interact with the task performer. These interactions typically involve an AR headset equipped with sensors that capture video, audio, and haptic feedback. Previous works have sought to facilitate the development of intelligent AR assistants by visualizing these sensor data streams in conjunction with the assistant's perception and reasoning model outputs. However, existing visual analytics systems do not focus on user modeling or include biometric data, and they can visualize only a single task session for a single performer at a time. Moreover, they typically assume a task involves linear progression from one step to the next. We propose a visual analytics system that allows users to compare performance across multiple task sessions, focusing on non-linear tasks where different step sequences can lead to success. In particular, we design visualizations for understanding user behavior through functional near-infrared spectroscopy (fNIRS) data as a proxy for perception, attention, and memory, as well as corresponding motion data (acceleration, angular velocity, and gaze). We distill these insights into embedding representations that allow users to easily select groups of sessions with similar behaviors. We provide two case studies that demonstrate how to use these visualizations to gain insights about task performance using data collected during helicopter copilot training tasks. Finally, we evaluate our approach through an in-depth examination of a think-aloud experiment with five domain experts.
KW - Application Motivated Visualization
KW - AR/VR/Immersive
KW - Image and Video Data
KW - Mobile
KW - Perception & Cognition
KW - Specialized Input/Display Hardware
KW - Temporal Data
UR - http://www.scopus.com/inward/record.url?scp=85203797716&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85203797716&partnerID=8YFLogxK
DO - 10.1109/TVCG.2024.3456388
M3 - Article
AN - SCOPUS:85203797716
SN - 1077-2626
JO - IEEE Transactions on Visualization and Computer Graphics
JF - IEEE Transactions on Visualization and Computer Graphics
ER -