Abstract
We sought to determine how local and global features within an image interact by examining whether orientation discrimination thresholds could be modified by contextual information. In particular, we investigated how local orientation signals within an image are pooled together, and whether this pooling process depends on the global orientation content of the image. We find that observers' orientation judgments depend on surround contextual information, with performance being optimal when the center and surround stimuli are clearly distinct. In cases where the center and surround were not clearly segregated, we report two sets of results. If there was ambiguity regarding the perception of a global structure (i.e., a small mismatch between local cues), observers' performance was impaired. If there was no mismatch and local and global cues were consistent with the perception of a single surface, observers performed as well as in the distinct-surfaces case. Although some of our results can be largely accounted for by interactions between differently oriented filters, other aspects are more difficult to reconcile with this explanation. We suggest that low-level filtering constrains observers' performance, and that influences arising from image segmentation modify how local orientation signals are pooled together.
| Original language | English (US) |
|---|---|
| Pages (from-to) | 1915-1930 |
| Number of pages | 16 |
| Journal | Vision Research |
| Volume | 41 |
| Issue number | 15 |
| DOIs | |
| State | Published - 2001 |
Keywords
- Contextual influences
- Cues
- Orientation discrimination
ASJC Scopus subject areas
- Ophthalmology
- Sensory Systems