A general framework for inferring Bayesian ideal observer models from psychophysical data

Tyler S. Manning, Benjamin N. Naecker, Iona R. McLean, Bas Rokers, Jonathan W. Pillow, Emily A. Cooper

Research output: Contribution to journal › Article › peer-review


A central question in neuroscience is how sensory inputs are transformed into percepts. At this point, it is clear that this process is strongly influenced by prior knowledge of the sensory environment. Bayesian ideal observer models provide a key link between data and theory that can help researchers evaluate how prior knowledge is represented and integrated with incoming sensory information. However, the statistical prior employed by a Bayesian observer cannot be measured directly, and must instead be inferred from behavioral measurements. Here we review the general problem of inferring priors from psychophysical data, and the simple solution that follows from assuming a prior that is a Gaussian probability distribution. As our understanding of sensory processing advances, however, there is an increasing need for methods to flexibly recover the shape of Bayesian priors that are not well-approximated by elementary functions. To address this issue, we describe a novel approach that applies to arbitrary prior shapes, which we parameterize using mixtures of Gaussian distributions. After incorporating a simple approximation, this method produces an analytical solution for psychophysical quantities that can be numerically optimized to recover the shapes of Bayesian priors. This approach offers advantages in flexibility, while still providing an analytical framework for many scenarios. We provide a MATLAB toolbox implementing key computations described herein.
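The abstract describes two estimation settings: the classic conjugate case, where a Gaussian prior combined with a Gaussian likelihood yields a simple closed-form posterior mean, and the paper's more flexible case, where the prior is a mixture of Gaussians and each component remains conjugate. The sketch below illustrates both results in Python (the paper's toolbox is in MATLAB); all function and variable names are illustrative assumptions, not the toolbox's API.

```python
# Illustrative sketch of Bayesian ideal-observer posterior-mean estimates,
# assuming a Gaussian likelihood N(m; s, sigma_m^2) for measurement m of
# stimulus s. Names are hypothetical, not from the paper's MATLAB toolbox.
import numpy as np

def gaussian_posterior_mean(m, sigma_m, mu0, sigma0):
    """Posterior mean with a Gaussian prior N(s; mu0, sigma0^2):
    the standard conjugate, reliability-weighted average."""
    w = sigma0**2 / (sigma0**2 + sigma_m**2)  # weight on the measurement
    return w * m + (1.0 - w) * mu0

def mog_posterior_mean(m, sigma_m, weights, mus, sigmas):
    """Posterior mean with a mixture-of-Gaussians prior
    sum_k w_k N(s; mu_k, sigma_k^2). Each component stays conjugate,
    so the posterior is again a mixture with re-weighted components."""
    weights, mus, sigmas = map(np.asarray, (weights, mus, sigmas))
    # Marginal likelihood of m under component k: N(m; mu_k, sigma_k^2 + sigma_m^2)
    var_pred = sigmas**2 + sigma_m**2
    resp = weights * np.exp(-0.5 * (m - mus)**2 / var_pred) / np.sqrt(2 * np.pi * var_pred)
    resp = resp / resp.sum()  # posterior component responsibilities
    # Conjugate per-component posterior means
    post_mus = (sigmas**2 * m + sigma_m**2 * mus) / var_pred
    return float(resp @ post_mus)
```

With a single mixture component, `mog_posterior_mean` reduces exactly to `gaussian_posterior_mean`; recovering a prior from behavior would then amount to optimizing the mixture parameters so that predicted estimates match psychophysical data.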

Original language: English (US)
Issue number: 1
State: Published - Jan 2023


  • Bayesian Inference
  • Ideal Observer Models
  • Perception

ASJC Scopus subject areas

  • General Neuroscience


