Abstract
A model is presented, consonant with current views regarding the neurophysiology and psychophysics of motion perception, that combines the outputs of a set of motion-sensitive spatiotemporal filters to estimate the velocity of a moving texture - without first computing component (or normal) velocity. A parallel implementation of the model encodes velocity as the peak in a distribution of velocity-sensitive units that behave much like cells of the middle temporal (MT) area of the primate brain.
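As a rough illustration of the idea summarized above (not the paper's actual implementation), the sketch below filters a moving random texture with a bank of spatiotemporal frequency-tuned filters and reads out velocity as the peak of a population of velocity-tuned units. The texture, filter tunings, Gaussian bandwidths, candidate-velocity grid, and readout rule are all assumptions made for the example.

```python
# Illustrative sketch: estimate the velocity of a moving texture from the energies
# of a bank of spatiotemporal filters, read out as the peak of a population of
# velocity-tuned units. All parameters here are assumptions, not the paper's values.

import numpy as np

rng = np.random.default_rng(0)

# --- Moving random texture: frame t is the texture shifted by t * (vy, vx) ---
true_v = (1.0, 2.0)                      # ground-truth (vy, vx) in pixels per frame
N, T = 64, 16
texture = rng.standard_normal((N, N))
frames = np.stack([np.roll(texture,
                           (round(t * true_v[0]), round(t * true_v[1])),
                           axis=(0, 1))
                   for t in range(T)])

# --- Spatiotemporal filter bank, implemented in the frequency domain. ---
# Each filter is a Gaussian passband around a preferred (ft, fy, fx) frequency;
# the stimulus power it passes stands in for a quadrature-pair motion energy.
F = np.fft.fftn(frames)
power = np.abs(F) ** 2
ft = np.fft.fftfreq(T)[:, None, None]
fy = np.fft.fftfreq(N)[None, :, None]
fx = np.fft.fftfreq(N)[None, None, :]

def filter_energy(pref_ft, pref_fy, pref_fx, bw=0.05):
    """Stimulus power passed by a Gaussian filter centered on a preferred frequency."""
    g = np.exp(-((ft - pref_ft) ** 2 + (fy - pref_fy) ** 2 + (fx - pref_fx) ** 2)
               / (2 * bw ** 2))
    return np.sum(g * power)

# Sample filter tunings over spatial orientation, spatial frequency, and temporal frequency.
filters = [(tf, r * np.sin(a), r * np.cos(a))
           for a in np.linspace(0, np.pi, 8, endpoint=False)
           for r in (0.1, 0.2)
           for tf in np.linspace(-0.4, 0.4, 9)]
energies = np.array([filter_energy(*f) for f in filters])

# --- Velocity-tuned units: each prefers filters whose tunings lie near the plane ---
# ft = -(vy*fy + vx*fx) that a rigid translation at (vy, vx) occupies in frequency space.
candidates = [(vy, vx)
              for vy in np.arange(-3, 3.5, 0.5)
              for vx in np.arange(-3, 3.5, 0.5)]
responses = []
for vy, vx in candidates:
    w = np.array([np.exp(-((f_t + vy * f_y + vx * f_x) ** 2) / (2 * 0.05 ** 2))
                  for f_t, f_y, f_x in filters])
    responses.append(np.dot(w, energies))

# The velocity estimate is the peak of the population response.
best = candidates[int(np.argmax(responses))]
print("true velocity (vy, vx):", true_v, " estimated:", best)
```

The design choice that makes this work is the standard frequency-domain fact that a rigidly translating pattern concentrates its spectral energy on a plane through the origin; each velocity-tuned unit pools the filter energies lying near one such plane, so the unit whose plane matches the motion responds most strongly and the velocity can be read off the population peak without computing component (normal) velocities first.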
| Original language | English (US) |
| --- | --- |
| Title of host publication | Proceedings of the Optical Society of America Topical Meeting on Computer Vision |
| Publisher | Optical Society of America |
| Pages | 151-154 |
| Number of pages | 4 |
| ISBN (Print) | 093665967X |
| State | Published - 1987 |
ASJC Scopus subject areas
- General Engineering