TY - CONF
T1 - Relative Deviation Margin Bounds
AU - Cortes, Corinna
AU - Mohri, Mehryar
AU - Suresh, Ananda Theertha
N1 - Publisher Copyright:
Copyright © 2021 by the author(s)
PY - 2021
Y1 - 2021
N2 - We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds, in terms of either the Rademacher complexity or the empirical ℓ∞-covering number of the hypothesis set used, both distribution-dependent and valid for general families. Furthermore, using our relative deviation margin bounds, we derive distribution-dependent generalization bounds for unbounded loss functions under the assumption of a finite moment. We also briefly highlight several applications of these bounds and discuss their connection with existing results.
AB - We present a series of new and more favorable margin-based learning guarantees that depend on the empirical margin loss of a predictor. We give two types of learning bounds, in terms of either the Rademacher complexity or the empirical ℓ∞-covering number of the hypothesis set used, both distribution-dependent and valid for general families. Furthermore, using our relative deviation margin bounds, we derive distribution-dependent generalization bounds for unbounded loss functions under the assumption of a finite moment. We also briefly highlight several applications of these bounds and discuss their connection with existing results.
UR - http://www.scopus.com/inward/record.url?scp=85129255819&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85129255819&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85129255819
T3 - Proceedings of Machine Learning Research
SP - 2122
EP - 2131
BT - Proceedings of the 38th International Conference on Machine Learning, ICML 2021
PB - ML Research Press
T2 - 38th International Conference on Machine Learning, ICML 2021
Y2 - 18 July 2021 through 24 July 2021
ER -