Machine Learning for Causal Inference

Jennifer Hill, George Perrett, Vincent Dorie

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Estimation of causal effects requires making comparisons across groups of observations exposed and not exposed to a treatment or cause. This chapter introduces the building blocks necessary to understand what causal quantities represent conceptually and why they are so difficult to estimate empirically. At a basic level, causal inference methods require fair comparisons. Regression provides one way to condition on confounders in an attempt to create fair comparisons. Boosted regression trees emerged as a way to address the issues that plague single regression trees: overfitting, difficulty in capturing additive structure, and overemphasis on high-level interactions. The mean structure of Bayesian Additive Regression Trees (BART) is the same as that of a boosted regression tree. The BART approach to causal inference, with its combination of flexible modeling embedded in a Bayesian likelihood framework, provides the opportunity for simultaneous inference on individual-level treatment effects as well as a variety of average treatment effects.
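As a rough illustration of the conditional-modeling idea summarized above, the sketch below fits a single flexible outcome model, predicts both potential outcomes for every unit, and averages the differences to obtain individual-level and average treatment effect estimates. It uses scikit-learn's GradientBoostingRegressor as a stand-in for BART; the simulated data, variable names, and the true effect of 2 are illustrative assumptions and are not taken from the chapter.

```python
# Minimal sketch: flexible outcome modeling for causal effect estimation,
# with GradientBoostingRegressor standing in for BART. All data are simulated.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Simulated observational data: confounders X, binary treatment z, outcome y.
n = 1000
X = rng.normal(size=(n, 3))
propensity = 1 / (1 + np.exp(-X[:, 0]))           # treatment assignment depends on X (confounding)
z = rng.binomial(1, propensity)
y = X[:, 0] + 0.5 * X[:, 1] + 2.0 * z + rng.normal(scale=1.0, size=n)  # true effect = 2 (assumed)

# Fit one flexible model for E[y | z, X].
model = GradientBoostingRegressor()
model.fit(np.column_stack([z, X]), y)

# Predict both potential outcomes for every unit.
y1_hat = model.predict(np.column_stack([np.ones(n), X]))   # under treatment
y0_hat = model.predict(np.column_stack([np.zeros(n), X]))  # under control

# Individual-level effect estimates and the sample average treatment effect.
ite_hat = y1_hat - y0_hat
ate_hat = ite_hat.mean()
print(f"Estimated ATE: {ate_hat:.2f}")  # should land near the simulated effect of 2
```

Unlike BART, this point-estimate version produces no posterior draws, so it cannot quantify uncertainty about the individual-level or average effects; that simultaneous uncertainty quantification is what the Bayesian likelihood framework described in the abstract adds.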

Original language: English (US)
Title of host publication: Handbook of Matching and Weighting Adjustments for Causal Inference
Publisher: CRC Press
Pages: 415-444
Number of pages: 30
ISBN (Electronic): 9781000850819
ISBN (Print): 9780367609528
State: Published - Jan 1 2023

ASJC Scopus subject areas

  • General Mathematics

