Abstract
People make surprising but reliable perceptual errors. Here, we provide a unified explanation for systematic errors in the perception of three-dimensional (3-D) motion. To do so, we characterized the binocular retinal motion signals produced by objects moving through arbitrary locations in 3-D. Next, we developed a Bayesian model, treating 3-D motion perception as optimal inference given sensory noise in the measurement of retinal motion. The model predicts a set of systematic perceptual errors, which depend on stimulus distance, contrast, and eccentricity. We then used a virtual-reality headset as well as a standard 3-D desktop stereoscopic display to test these predictions in a series of perceptual experiments. As predicted, we found evidence that errors in 3-D motion perception depend on the contrast, viewing distance, and eccentricity of a stimulus. These errors include a lateral bias in perceived motion direction and a surprising tendency to misreport approaching motion as receding and vice versa. In sum, we present a Bayesian model that provides a parsimonious account for a range of systematic misperceptions of motion in naturalistic environments.
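The abstract does not spell out the model's equations. As an illustrative sketch only, the snippet below shows one standard way to pose such an inference: a maximum a posteriori (MAP) estimate of lateral and depth velocity (vx, vz) from noisy left- and right-eye retinal velocities, using small-angle binocular viewing geometry, Gaussian measurement noise, and a zero-mean Gaussian prior favoring slow speeds. The geometry, the prior, and all parameter values (interocular distance `a`, noise levels) are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def map_3d_velocity(omega_L, omega_R, Z, a=0.064,
                    sigma_retinal=0.01, sigma_prior=1.0):
    """MAP estimate of (vx, vz) from noisy binocular retinal velocities.

    Assumed small-angle geometry for a midline object at distance Z (m):
        omega_L ~ vx / Z - (a / 2) * vz / Z**2
        omega_R ~ vx / Z + (a / 2) * vz / Z**2
    Gaussian measurement noise (sigma_retinal, rad/s) and a zero-mean
    Gaussian slow-motion prior (sigma_prior, m/s) are illustrative
    assumptions, not quantities reported in the abstract.
    """
    # Linear map from 3-D velocity (vx, vz) to retinal velocities.
    M = np.array([[1.0 / Z, -a / (2.0 * Z**2)],
                  [1.0 / Z,  a / (2.0 * Z**2)]])
    y = np.array([omega_L, omega_R])
    # Posterior precision = likelihood precision + prior precision.
    A = M.T @ M / sigma_retinal**2 + np.eye(2) / sigma_prior**2
    b = M.T @ y / sigma_retinal**2
    return np.linalg.solve(A, b)  # posterior mean (vx_hat, vz_hat)
```

Because the depth column of the geometry matrix scales with 1/Z², motion in depth is measured less reliably at larger viewing distances, so the prior shrinks the depth component of the estimate more strongly than the lateral one. Qualitatively, this is the kind of mechanism that can produce a lateral bias and, under sufficient sensory noise, sign errors in perceived motion in depth, consistent with the error pattern described above.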
| Original language | English (US) |
| --- | --- |
| Article number | 23 |
| Pages (from-to) | 1-23 |
| Number of pages | 23 |
| Journal | Journal of Vision |
| Volume | 18 |
| Issue number | 3 |
| DOIs | |
| State | Published - 2018 |
Keywords
- Computational modeling
- Motion perception
- Motion-3D
- Virtual reality
ASJC Scopus subject areas
- Ophthalmology
- Sensory Systems