Testing multi-scale processing in the auditory system

Xiangbin Teng, Xing Tian, David Poeppel

Research output: Contribution to journal › Article › peer-review


Natural sounds contain information on multiple timescales, so the auditory system must analyze and integrate acoustic information on those different scales to extract behaviorally relevant information. However, this multi-scale processing has not been widely investigated, and existing models of temporal integration are built mainly on detection or recognition tasks at a single timescale. Here we use a paradigm requiring processing on relatively 'local' and 'global' scales and provide evidence suggesting that the auditory system extracts fine-detail acoustic information using short temporal windows and abstracts global acoustic patterns using long temporal windows. Performance on behavioral tasks that require processing fine-detail information does not improve with longer stimulus length, contrary to the predictions of previous temporal integration models such as the multiple-looks model and the spectro-temporal excitation pattern model. Moreover, the perceptual construction of putatively 'unitary' auditory events requires more than hundreds of milliseconds. These findings support the hypothesis of dual-scale processing, likely implemented in the auditory cortex.

Original language: English (US)
Article number: 34390
Journal: Scientific Reports
State: Published - Oct 7, 2016

ASJC Scopus subject areas

  • General
