Abstract
Although many high-level perceptual tasks can be achieved on the basis of information encoded through a single sensory modality, it is increasingly evident that maintaining a robust, coherent perception of the objects that surround us depends on multisensory integration. Consequently, multisensory representations of object information in memory, particularly those based on vision and touch, support more efficient object recognition and spatial localisation. This chapter reviews evidence on how multisensory object information can resolve problems often associated with unisensory processing, such as maintaining shape constancy across changes in object viewpoint or motion, and updating spatial representations with changes in observer position. Further evidence from neuroimaging studies suggests that the perceptual processes involved in object and spatial recognition are underpinned by shared neural resources. Taken together, these studies suggest that the traditional view of sensory systems processing object information independently is breaking down; the weight of evidence now lies firmly in favour of sensory systems that are highly interactive throughout the information-processing hierarchy, and that can modulate and affect high-level perceptual outcomes.
| Field | Value |
|---|---|
| Original language | English (US) |
| Title of host publication | Multisensory Object Perception in the Primate Brain |
| Publisher | Springer New York |
| Pages | 251-271 |
| Number of pages | 21 |
| ISBN (Electronic) | 9781441956156 |
| ISBN (Print) | 9781441956149 |
| DOIs | |
| State | Published - 2010 |
ASJC Scopus subject areas
- General Medicine
- General Neuroscience