A novel role for visual perspective cues in the neural computation of depth

Hyunggoo R. Kim, Dora E. Angelaki, Gregory C. DeAngelis

Research output: Contribution to journal › Review article › peer-review


As we explore a scene, our eye movements add global patterns of motion to the retinal image, complicating visual motion produced by self-motion or moving objects. Conventionally, it has been assumed that extraretinal signals, such as efference copy of smooth pursuit commands, are required to compensate for the visual consequences of eye rotations. We consider an alternative possibility: namely, that the visual system can infer eye rotations from global patterns of image motion. We visually simulated combinations of eye translation and rotation, including perspective distortions that change dynamically over time. We found that incorporating these 'dynamic perspective' cues allowed the visual system to generate selectivity for depth sign from motion parallax in macaque cortical area MT, a computation that was previously thought to require extraretinal signals regarding eye velocity. Our findings suggest neural mechanisms that analyze global patterns of visual motion to perform computations that require knowledge of eye rotations.

Original language: English (US)
Pages (from-to): 129-137
Number of pages: 9
Journal: Nature Neuroscience
Issue number: 1
State: Published - Jan 1 2015

ASJC Scopus subject areas

  • General Neuroscience


