Analysis and training of human sound localization behavior with VR application

Yun Han Wu, Agnieszka Roginska

Research output: Contribution to conference › Paper › peer-review

Abstract

This paper presents a VR training application built to improve human sound localization performance with generic head-related transfer functions. Subjects go through four phases in the experiment: tutorial, pre-test, training, and post-test. In each trial they are instructed to trigger a sound stimulus and report its perceived location by rotating their head to face that direction. The data captured automatically during each trial includes the correct and reported positions of the stimulus, the reaction time, and the head rotation sampled every 50 ms. The analysis shows a statistically significant improvement in subjects' localization performance.
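The per-trial record the abstract describes (correct and reported stimulus position, reaction time, and head rotation sampled every 50 ms) maps onto a simple data structure. Below is a minimal Python sketch with hypothetical field names; the paper does not specify its implementation, so this only illustrates the kind of trial logging and localization-error metric involved:

```python
import math
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TrialRecord:
    """One localization trial; fields mirror the data the abstract lists.
    All names here are illustrative, not taken from the paper."""
    target_az_el: Tuple[float, float]    # correct stimulus direction (azimuth, elevation) in degrees
    reported_az_el: Tuple[float, float]  # direction the subject's head faced when responding
    reaction_time_s: float               # time from stimulus onset to response
    head_rotation_log: List[Tuple[float, float]] = field(default_factory=list)  # one sample per 50 ms

def angular_error_deg(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    """Great-circle angle between two (azimuth, elevation) directions, in degrees."""
    az1, el1 = map(math.radians, a)
    az2, el2 = map(math.radians, b)
    cos_angle = (math.sin(el1) * math.sin(el2)
                 + math.cos(el1) * math.cos(el2) * math.cos(az1 - az2))
    # Clamp to [-1, 1] to guard against floating-point overshoot before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
```

Errors computed this way for pre-test and post-test trials could then be compared with a paired statistical test, which is the kind of comparison behind the significance result the abstract reports.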

Original language: English (US)
State: Published - 2019
Event: 2019 AES International Conference on Immersive and Interactive Audio: Creating the Next Dimension of Sound Experience - York, United Kingdom
Duration: Mar 27, 2019 – Mar 29, 2019

Conference

Conference: 2019 AES International Conference on Immersive and Interactive Audio: Creating the Next Dimension of Sound Experience
Country/Territory: United Kingdom
City: York
Period: 3/27/19 – 3/29/19

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Acoustics and Ultrasonics
