TY - GEN
T1 - Accelerating Saccadic Response through Spatial and Temporal Cross-Modal Misalignments
AU - Jiménez Navarro, Daniel
AU - Peng, Xi
AU - Zhang, Yunxiang
AU - Myszkowski, Karol
AU - Seidel, Hans-Peter
AU - Sun, Qi
AU - Serrano, Ana
N1 - Publisher Copyright:
© 2024 Owner/Author.
PY - 2024/7/13
Y1 - 2024/7/13
N2 - Human senses and perception are our mechanisms to explore the external world. In this context, visual saccades (rapid and coordinated eye movements) serve as a primary tool for awareness of our surroundings. Typically, our perception is not limited to visual stimuli alone but is enriched by cross-modal interactions, such as the combination of sight and hearing. In this work, we investigate the temporal and spatial relationship of these interactions, focusing on how auditory cues that precede visual stimuli influence saccadic latency, the time it takes for the eyes to react and start moving towards a visual target. Our research, conducted within a virtual reality environment, reveals that auditory cues preceding visual information can significantly accelerate saccadic responses, but this effect plateaus beyond certain temporal thresholds. Additionally, while the spatial positioning of visual stimuli influences the speed of these eye movements, as reported in previous research, we find that the location of auditory cues with respect to their corresponding visual stimulus does not have a comparable effect. To validate our findings, we implement two practical applications: first, a basketball training task set in a more realistic environment with complex audiovisual signals, and second, an interactive farm game that explores previously untested values of our key factors. Lastly, we discuss various potential applications where our model could be beneficial.
AB - Human senses and perception are our mechanisms to explore the external world. In this context, visual saccades (rapid and coordinated eye movements) serve as a primary tool for awareness of our surroundings. Typically, our perception is not limited to visual stimuli alone but is enriched by cross-modal interactions, such as the combination of sight and hearing. In this work, we investigate the temporal and spatial relationship of these interactions, focusing on how auditory cues that precede visual stimuli influence saccadic latency, the time it takes for the eyes to react and start moving towards a visual target. Our research, conducted within a virtual reality environment, reveals that auditory cues preceding visual information can significantly accelerate saccadic responses, but this effect plateaus beyond certain temporal thresholds. Additionally, while the spatial positioning of visual stimuli influences the speed of these eye movements, as reported in previous research, we find that the location of auditory cues with respect to their corresponding visual stimulus does not have a comparable effect. To validate our findings, we implement two practical applications: first, a basketball training task set in a more realistic environment with complex audiovisual signals, and second, an interactive farm game that explores previously untested values of our key factors. Lastly, we discuss various potential applications where our model could be beneficial.
KW - Audiovisual integration
KW - cross-modal interactions
KW - multisensory perception
KW - saccadic latency
KW - virtual reality
UR - http://www.scopus.com/inward/record.url?scp=85199898142&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85199898142&partnerID=8YFLogxK
U2 - 10.1145/3641519.3657432
DO - 10.1145/3641519.3657432
M3 - Conference contribution
AN - SCOPUS:85199898142
T3 - Proceedings - SIGGRAPH 2024 Conference Papers
BT - Proceedings - SIGGRAPH 2024 Conference Papers
A2 - Spencer, Stephen N.
PB - Association for Computing Machinery, Inc
T2 - SIGGRAPH 2024 Conference Papers
Y2 - 28 July 2024 through 1 August 2024
ER -