H-Consistency Guarantees for Regression

Anqi Mao, Mehryar Mohri, Yutao Zhong

Research output: Contribution to journal › Conference article › peer-review

Abstract

We present a detailed study of H-consistency bounds for regression. We first present new theorems that generalize the tools previously given to establish H-consistency bounds. This generalization proves essential for analyzing H-consistency bounds specific to regression. Next, we prove a series of novel H-consistency bounds for surrogate loss functions of the squared loss, under the assumption of a symmetric distribution and a bounded hypothesis set. This includes positive results for the Huber loss, all ℓp losses, p ≥ 1, the squared ε-insensitive loss, as well as a negative result for the ε-insensitive loss used in Support Vector Regression (SVR). We further leverage our analysis of H-consistency for regression and derive principled surrogate losses for adversarial regression (Section 5). This readily establishes novel algorithms for adversarial regression, for which we report favorable experimental results in Section 6.
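For readers unfamiliar with the losses named in the abstract, the following is a minimal sketch of their standard textbook definitions, written as functions of the residual r = h(x) − y. These are the usual formulations, not code from the paper; the parameter names (delta, eps, p) are illustrative defaults.

```python
def squared_loss(r):
    # The target loss: r^2.
    return r ** 2

def huber_loss(r, delta=1.0):
    # Quadratic near zero, linear in the tails; delta sets the crossover.
    if abs(r) <= delta:
        return 0.5 * r ** 2
    return delta * (abs(r) - 0.5 * delta)

def lp_loss(r, p=1.5):
    # Member of the family of ell_p losses, p >= 1.
    return abs(r) ** p

def eps_insensitive(r, eps=0.5):
    # Loss used in Support Vector Regression (SVR); zero inside the eps-tube.
    return max(0.0, abs(r) - eps)

def squared_eps_insensitive(r, eps=0.5):
    # Squared variant of the eps-insensitive loss.
    return max(0.0, abs(r) - eps) ** 2
```

The paper's results contrast these surrogates: the Huber, ℓp, and squared ε-insensitive losses admit H-consistency bounds with respect to the squared loss under the stated assumptions, whereas the (plain) ε-insensitive loss does not.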

Original language: English (US)
Pages (from-to): 34712-34737
Number of pages: 26
Journal: Proceedings of Machine Learning Research
Volume: 235
State: Published - 2024
Event: 41st International Conference on Machine Learning, ICML 2024 - Vienna, Austria
Duration: Jul 21, 2024 - Jul 27, 2024

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
