Relative deviation learning bounds and generalization with unbounded loss functions

Corinna Cortes, Spencer Greenberg, Mehryar Mohri

Research output: Contribution to journal › Article › peer-review


We present an extensive analysis of relative deviation bounds, including detailed proofs of two-sided inequalities and their implications. We also give detailed proofs of two-sided generalization bounds that hold in the general case of unbounded loss functions, under the assumption that a moment of the loss is bounded. We then illustrate how to apply these results in a sample application: the analysis of importance weighting.
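As a hedged illustration of the sample application mentioned above, the following sketch shows a plain importance-weighted empirical risk estimator under covariate shift. The function name, densities, and loss are illustrative assumptions, not the paper's notation: we reweight losses on a sample from a source distribution p by w(x) = q(x)/p(x) to estimate the expected loss under a target distribution q.

```python
import numpy as np

def importance_weighted_risk(losses, weights):
    """Importance-weighted empirical risk: the sample average of
    w(x_i) * L(h, x_i), an unbiased estimate of the target-domain risk
    when weights = q/p are the true density ratios (illustrative sketch)."""
    losses = np.asarray(losses, dtype=float)
    weights = np.asarray(weights, dtype=float)
    return float(np.mean(weights * losses))

def normal_pdf(x, mu, sigma):
    # Density of N(mu, sigma^2), used here only to form the weights q/p.
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# Toy example: sample from the source p = N(0, 1), reweight toward the
# target q = N(0.5, 1), with the illustrative loss L(h, x) = x^2.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=10_000)
w = normal_pdf(x, 0.5, 1.0) / normal_pdf(x, 0.0, 1.0)
est = importance_weighted_risk(x ** 2, w)  # approximates E_q[x^2] = 1.25
```

Note that even in this benign Gaussian example the weighted loss w(x) L(h, x) is unbounded, which is exactly why generalization guarantees for importance weighting call for bounds that only assume a bounded moment of the loss rather than a uniform bound.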

Original language: English (US)
Pages (from-to): 45-70
Number of pages: 26
Journal: Annals of Mathematics and Artificial Intelligence
Issue number: 1
State: Published - Jan 1 2019

Keywords

  • Generalization bounds
  • Importance weighting
  • Learning theory
  • Machine learning
  • Relative deviation bounds
  • Unbounded loss functions
  • Unbounded regression

ASJC Scopus subject areas

  • Artificial Intelligence
  • Applied Mathematics

