Simple rules to guide expert classifications

Jongbin Jung, Connor Concannon, Ravi Shroff, Sharad Goel, Daniel G. Goldstein

Research output: Contribution to journal › Article › peer-review


Judges, doctors and managers are among those decision makers who must often choose a course of action under limited time, with limited knowledge and without the aid of a computer. Because data-driven methods typically outperform unaided judgements, resource-constrained practitioners can benefit from simple, statistically derived rules that can be applied mentally. In this work, we formalize long-standing observations about the efficacy of improper linear models to construct accurate yet easily applied rules. To test the performance of this approach, we conduct a large-scale evaluation in 22 domains and focus in detail on one: judicial decisions to release or detain defendants while they await trial. In these domains, we find that simple rules rival the accuracy of complex prediction models that base decisions on considerably more information. Further, comparing with unaided judicial decisions, we find that simple rules substantially outperform the human experts. To conclude, we present an analytical framework that sheds light on why simple rules perform as well as they do.
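The "improper linear models" the abstract refers to are rules that add or subtract a handful of standardized predictors with unit weights instead of fitting optimal coefficients. A minimal sketch of such a unit-weight rule, on synthetic data (the features, weights and threshold below are illustrative assumptions, not the paper's construction):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: two informative features, one pure-noise feature.
n = 1000
X = rng.normal(size=(n, 3))
logits = 1.5 * X[:, 0] - 1.0 * X[:, 1]            # feature 2 carries no signal
y = (logits + rng.normal(size=n) > 0).astype(int)

# Improper linear model (Dawes-style unit weights):
# z-score each feature, then weight it only by the sign of its
# correlation with the outcome -- simple enough to apply mentally.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
signs = np.sign([np.corrcoef(Z[:, j], y)[0, 1] for j in range(X.shape[1])])
score = Z @ signs                                  # just add/subtract z-scores
pred = (score > 0).astype(int)

accuracy = (pred == y).mean()
print(f"unit-weight rule accuracy: {accuracy:.2f}")
```

Even with the noise feature included at full (unit) weight, such a rule tends to recover most of the accuracy of a properly fitted linear model, which is the long-standing observation the paper formalizes.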

Original language: English (US)
Pages (from-to): 771-800
Number of pages: 30
Journal: Journal of the Royal Statistical Society. Series A: Statistics in Society
Issue number: 3
State: Published - Jun 1 2020


Keywords

  • Heuristics
  • Judgement and decision making
  • Policy evaluation
  • Sensitivity analysis

ASJC Scopus subject areas

  • Statistics and Probability
  • Social Sciences (miscellaneous)
  • Economics and Econometrics
  • Statistics, Probability and Uncertainty


