Abstract
A fundamental challenge for the visual system is to extract the 3D spatial structure of the environment. When an observer translates without moving the eyes, the retinal speed of a stationary object is related to its distance by a scale factor that depends on the velocity of the observer's self-motion. Here, we aim to test whether the brain uses vestibular cues to self-motion to estimate distance to stationary surfaces in the environment. This relationship was systematically probed using a two-alternative forced-choice task in which distance perceived from monocular image motion during passive body translation was compared to distance perceived from binocular disparity while subjects were stationary. We show that perceived distance from motion depended on both observer velocity and retinal speed. For a given head speed, slower retinal speeds led to the perception of farther distances. Likewise, for a given retinal speed, slower head speeds led to the perception of nearer distances. However, these relationships were weak in some subjects and absent in others, and distance estimated from self-motion and retinal image motion was substantially compressed relative to distance estimated from binocular disparity. Overall, our findings suggest that the combination of retinal image motion and vestibular signals related to head velocity can provide a rudimentary capacity for distance estimation.
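The scale-factor relationship described above can be illustrated with a minimal sketch. For lateral head translation at speed v with the eyes held stationary, a point at distance d near straight ahead produces a retinal angular speed of roughly ω ≈ v / d (small-angle approximation), so distance can in principle be recovered as d = v / ω. The function names below are illustrative, not taken from the paper:

```python
def retinal_speed(head_speed_m_s: float, distance_m: float) -> float:
    """Angular retinal speed (rad/s) of a stationary point at distance_m
    during lateral head translation at head_speed_m_s, assuming no eye
    movement and a point near straight ahead: omega ~= v / d."""
    return head_speed_m_s / distance_m

def distance_from_motion(head_speed_m_s: float, retinal_speed_rad_s: float) -> float:
    """Invert the relation: d = v / omega. Combining a vestibular estimate
    of head speed with retinal speed yields a distance estimate."""
    return head_speed_m_s / retinal_speed_rad_s

# Same retinal speed, slower head speed -> nearer estimated distance:
print(distance_from_motion(0.2, 0.1))  # 2.0 m
print(distance_from_motion(0.1, 0.1))  # 1.0 m
```

Consistent with the abstract, halving the retinal speed at a fixed head speed doubles the estimated distance, and halving the head speed at a fixed retinal speed halves it.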
| Original language | English (US) |
|---|---|
| Article number | 2 |
| Journal | Journal of Vision |
| Volume | 11 |
| Issue number | 13 |
| State | Published - 2011 |
Keywords
- Absolute distance
- Binocular disparity
- Depth perception
- Distance scaling
- Motion parallax
- Optic flow
- Vestibular
- Visual motion
ASJC Scopus subject areas
- Ophthalmology
- Sensory Systems