Abstract
This paper presents a VR training application built to improve human sound localization performance with generic head-related transfer functions. Subjects go through four phases in the experiment (tutorial, pre-test, training, and post-test), in which they are instructed to trigger a sound stimulus and report its perceived location by rotating their head to face that direction. The data captured automatically during each trial includes the correct and reported positions of the stimulus, the reaction time, and the head rotation sampled every 50 ms. The analysis shows a statistically significant improvement in subjects' performance.
Original language | English (US)
---|---
State | Published - 2019
Event | 2019 AES International Conference on Immersive and Interactive Audio: Creating the Next Dimension of Sound Experience - York, United Kingdom. Duration: Mar 27 2019 → Mar 29 2019
Conference
Conference | 2019 AES International Conference on Immersive and Interactive Audio: Creating the Next Dimension of Sound Experience
---|---
Country/Territory | United Kingdom
City | York
Period | 3/27/19 → 3/29/19
ASJC Scopus subject areas
- Electrical and Electronic Engineering
- Acoustics and Ultrasonics