TY - JOUR
T1 - Predictions drive neural representations of visual events ahead of incoming sensory information
AU - Blom, Tessel
AU - Feuerriegel, Daniel
AU - Johnson, Philippa
AU - Bode, Stefan
AU - Hogendoorn, Hinze
N1 - Publisher Copyright:
© 2020 National Academy of Sciences. All rights reserved.
PY - 2020/3/31
Y1 - 2020/3/31
N2 - The transmission of sensory information through the visual system takes time. As a result of these delays, the visual information available to the brain always lags behind the timing of events in the present moment. Compensating for these delays is crucial for functioning within dynamic environments, since interacting with a moving object (e.g., catching a ball) requires real-time localization of the object. One way the brain might achieve this is via prediction of anticipated events. Using time-resolved decoding of electroencephalographic (EEG) data, we demonstrate that the visual system represents the anticipated future position of a moving object, showing that predictive mechanisms activate the same neural representations as afferent sensory input. Importantly, this activation is evident before sensory input corresponding to the stimulus position is able to arrive. Finally, we demonstrate that, when predicted events do not eventuate, sensory information arrives too late to prevent the visual system from representing what was expected but never presented. Taken together, we demonstrate how the visual system can implement predictive mechanisms to preactivate sensory representations, and argue that this might allow it to compensate for its own temporal constraints, allowing us to interact with dynamic visual environments in real time.
KW - Neural delays
KW - Prediction
KW - Time-resolved decoding
KW - Visual system
UR - http://www.scopus.com/inward/record.url?scp=85082829164&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85082829164&partnerID=8YFLogxK
U2 - 10.1073/pnas.1917777117
DO - 10.1073/pnas.1917777117
M3 - Article
C2 - 32179666
AN - SCOPUS:85082829164
SN - 0027-8424
VL - 117
SP - 7510
EP - 7515
JO - Proceedings of the National Academy of Sciences of the United States of America
JF - Proceedings of the National Academy of Sciences of the United States of America
IS - 13
ER -