Deciding where to look based on visual, auditory, and semantic information

Kyeong Jin Tark, Clayton E. Curtis

Research output: Contribution to journal › Article › peer-review

Abstract

Neurons in the dorsal frontal and parietal cortex are thought to transform incoming visual signals into the spatial goals of saccades, a process known as target selection. Here, we used functional magnetic resonance imaging (fMRI) to test how target selection may generalize beyond visual transformations when auditory and semantic information is used for selection. We compared activity in the frontal and parietal cortex when subjects made visually, aurally, and semantically guided saccades to one of four differently colored dots. Selection was based on a visual cue (i.e., one of the dots blinked), an auditory cue (i.e., a white noise burst was emitted at the location of one of the dots), or a semantic cue (i.e., the color of one of the dots was spoken). Although neural responses in frontal and parietal cortex were robust, they were non-specific with regard to the type of information used for target selection. However, decoders trained on the patterns of activity in the intraparietal sulcus could classify both the type of cue used for target selection and the direction of the saccade. Therefore, we find evidence that the posterior parietal cortex is involved in transforming multimodal inputs into general spatial representations that can be used to guide saccades.

Original language: English (US)
Pages (from-to): 26-38
Number of pages: 13
Journal: Brain Research
Volume: 1525
DOIs
State: Published - Aug 7, 2013

Keywords

  • Auditory
  • Saccade
  • Semantic
  • Spatial cognition
  • fMRI

ASJC Scopus subject areas

  • General Neuroscience
  • Molecular Biology
  • Clinical Neurology
  • Developmental Biology
