Abstract
We present an extensive analysis of relative deviation bounds, including detailed proofs of two-sided inequalities and their implications. We also give detailed proofs of two-sided generalization bounds that hold in the general case of unbounded loss functions, under the assumption that a moment of the loss is bounded. We then illustrate how to apply these results in a sample application: the analysis of importance weighting.
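As context for the sample application mentioned above: importance weighting estimates an expectation under a target distribution p using samples drawn from a different source distribution q, reweighting each sample by w(x) = p(x)/q(x). The following is a minimal illustrative sketch, not the paper's analysis; the particular Gaussian choices and function names here are assumptions for demonstration only.

```python
import math
import random
import statistics

def importance_weighted_mean(f, samples, p_pdf, q_pdf):
    """Estimate E_p[f(X)] from samples drawn under q, weighting by p(x)/q(x)."""
    return statistics.fmean(
        (p_pdf(x) / q_pdf(x)) * f(x) for x in samples
    )

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (
        sigma * math.sqrt(2 * math.pi)
    )

# Hypothetical setup: target p = N(0, 1), source q = N(1, 1).
random.seed(0)
xs = [random.gauss(1.0, 1.0) for _ in range(100_000)]
est = importance_weighted_mean(
    lambda x: x * x,                      # estimate the second moment
    xs,
    lambda x: normal_pdf(x, 0.0, 1.0),    # target density p
    lambda x: normal_pdf(x, 1.0, 1.0),    # source density q
)
# est approximates E_{N(0,1)}[X^2] = 1
```

The weights w(x) = p(x)/q(x) can have heavy tails when p and q differ substantially, which is precisely why bounds for unbounded losses, such as those studied in this paper, are relevant to analyzing this estimator.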
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 45-70 |
| Number of pages | 26 |
| Journal | Annals of Mathematics and Artificial Intelligence |
| Volume | 85 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 1 2019 |
Keywords
- Generalization bounds
- Importance weighting
- Learning theory
- Machine learning
- Relative deviation bounds
- Unbounded loss functions
- Unbounded regression
ASJC Scopus subject areas
- Applied Mathematics
- Artificial Intelligence