Accelerating Saccadic Response through Spatial and Temporal Cross-Modal Misalignments

Daniel Jiménez Navarro, Xi Peng, Yunxiang Zhang, Karol Myszkowski, Hans-Peter Seidel, Qi Sun, Ana Serrano

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    Human senses and perception are our mechanisms to explore the external world. In this context, visual saccades (rapid and coordinated eye movements) serve as a primary tool for awareness of our surroundings. Typically, our perception is not limited to visual stimuli alone but is enriched by cross-modal interactions, such as the combination of sight and hearing. In this work, we investigate the temporal and spatial relationship of these interactions, focusing on how auditory cues that precede visual stimuli influence saccadic latency, the time it takes for the eyes to react and start moving towards a visual target. Our research, conducted within a virtual reality environment, reveals that auditory cues preceding visual information can significantly accelerate saccadic responses, but this effect plateaus beyond certain temporal thresholds. Additionally, while the spatial positioning of visual stimuli influences the speed of these eye movements, as reported in previous research, we find that the location of auditory cues with respect to their corresponding visual stimulus does not have a comparable effect. To validate our findings, we implement two practical applications: first, a basketball training task set in a more realistic environment with complex audiovisual signals, and second, an interactive farm game that explores previously untested values of our key factors. Lastly, we discuss various potential applications where our model could be beneficial.
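
    As a rough sketch of the plateau behavior the abstract describes, the snippet below models the latency benefit of an auditory cue as a saturating (clipped linear) function of how far the cue precedes the visual target. All names and constants (BASELINE_LATENCY_MS, MAX_REDUCTION_MS, PLATEAU_LEAD_MS) are illustrative placeholders, not the paper's fitted model or parameters.

        # Hypothetical illustration (not the authors' fitted model):
        # saccadic latency as a function of auditory lead time, i.e. how
        # far the auditory cue precedes the visual target. Earlier cues
        # shorten latency until the benefit saturates beyond a temporal
        # threshold; a clipped linear ramp captures that shape.

        BASELINE_LATENCY_MS = 200.0  # assumed visual-only saccadic latency
        MAX_REDUCTION_MS = 40.0      # assumed ceiling on the cue benefit
        PLATEAU_LEAD_MS = 150.0      # assumed lead time where the benefit saturates

        def predicted_latency(lead_ms: float) -> float:
            """Predicted saccadic latency (ms) when the auditory cue
            precedes the visual target by `lead_ms` milliseconds."""
            # Benefit grows with the lead time, then plateaus at the ceiling.
            fraction = min(max(lead_ms, 0.0) / PLATEAU_LEAD_MS, 1.0)
            return BASELINE_LATENCY_MS - MAX_REDUCTION_MS * fraction

        if __name__ == "__main__":
            for lead in (0, 50, 100, 150, 300):
                print(f"lead {lead:3d} ms -> latency {predicted_latency(lead):6.1f} ms")

    Note that beyond PLATEAU_LEAD_MS the predicted latency stays flat, mirroring the finding that the acceleration effect plateaus beyond certain temporal thresholds.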

    Original language: English (US)
    Title of host publication: Proceedings - SIGGRAPH 2024 Conference Papers
    Editors: Stephen N. Spencer
    Publisher: Association for Computing Machinery, Inc
    ISBN (Electronic): 9798400705250
    DOIs
    State: Published - Jul 13 2024
    Event: SIGGRAPH 2024 Conference Papers - Denver, United States
    Duration: Jul 28 2024 – Aug 1 2024

    Publication series

    Name: Proceedings - SIGGRAPH 2024 Conference Papers

    Conference

    Conference: SIGGRAPH 2024 Conference Papers
    Country/Territory: United States
    City: Denver
    Period: 7/28/24 – 8/1/24

    Keywords

    • Audiovisual integration
    • cross-modal interactions
    • multisensory perception
    • saccadic latency
    • virtual reality

    ASJC Scopus subject areas

    • Computer Vision and Pattern Recognition
    • Visual Arts and Performing Arts
    • Computer Graphics and Computer-Aided Design
