Abstract
This paper presents a novel multi-modality audio system for a virtual reality (VR) balance performance assessment application, examining how headphone and loudspeaker reproduction of auditory cues affected dynamic balance and motor control in healthy young adults. Participants in the simulation were instructed to dodge incoming balls while reacting to visual cues, performing cognitive tasks, and listening under different auditory conditions delivered via headphones or loudspeakers. The experiment was divided into three parts, each incorporating a distinct sound design: synchronized foreground ball sounds, background distracting sounds, and directional notification sounds. These sounds were delivered through four modalities: Headphones, Loudspeakers, Room Simulation (headphones with room acoustic simulation), and Passthrough (loudspeakers with headphones worn). The paper provides a comprehensive description of the audio system’s architecture, technical implementation, and sound design choices. The findings offer insights into designing audio systems for VR-based physical therapy and balance assessment, supporting the future development of applications ranging from user-friendly home setups to advanced clinical tools for analyzing and rehabilitating human movement and balance.
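
The three sound-design categories and four playback modalities described above imply a routing layer between in-simulation sound events and output devices. The sketch below is a minimal, purely illustrative configuration of that idea; the class and enum names, the bus labels, and the routing logic are assumptions for illustration and do not come from the paper's actual implementation.

```python
from enum import Enum, auto
from dataclasses import dataclass

class SoundCategory(Enum):
    """The three sound-design categories named in the abstract."""
    FOREGROUND_BALL = auto()            # synchronized with incoming balls
    BACKGROUND_DISTRACTOR = auto()      # distracting background sounds
    DIRECTIONAL_NOTIFICATION = auto()   # spatial cue prompting a response

class PlaybackModality(Enum):
    """The four reproduction modalities compared in the study."""
    HEADPHONES = auto()
    LOUDSPEAKERS = auto()
    ROOM_SIMULATION = auto()   # headphones with room acoustic simulation
    PASSTHROUGH = auto()       # loudspeakers while headphones are worn

@dataclass
class TrialAudioConfig:
    """Hypothetical per-trial configuration: which modality renders which categories."""
    modality: PlaybackModality
    enabled_categories: set[SoundCategory]

    def route(self, category: SoundCategory) -> str:
        # Illustrative routing only; the paper's audio engine is not reproduced here.
        if category not in self.enabled_categories:
            return "muted"
        if self.modality in (PlaybackModality.HEADPHONES, PlaybackModality.ROOM_SIMULATION):
            return "binaural bus"
        return "loudspeaker bus"

# Example: a trial using room-simulated headphone playback of all three sound types.
config = TrialAudioConfig(
    modality=PlaybackModality.ROOM_SIMULATION,
    enabled_categories=set(SoundCategory),
)
print(config.route(SoundCategory.DIRECTIONAL_NOTIFICATION))  # -> "binaural bus"
```

Such a separation of sound category from playback modality would let the same trial content be re-run under each of the four conditions without changing the sound assets themselves.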
Original language | English (US) |
---|---|
State | Published - 2024 |
Event | 5th AES International Conference on Audio for Virtual and Augmented Reality, AVAR 2024 - Redmond, United States; Duration: Aug 19 2024 → Aug 21 2024 |
Conference
Conference | 5th AES International Conference on Audio for Virtual and Augmented Reality, AVAR 2024 |
---|---|
Country/Territory | United States |
City | Redmond |
Period | 8/19/24 → 8/21/24 |
ASJC Scopus subject areas
- Electrical and Electronic Engineering
- Acoustics and Ultrasonics