Structured Prediction with Stronger Consistency Guarantees

Anqi Mao, Mehryar Mohri, Yutao Zhong

Research output: Contribution to journal › Conference article › peer-review

Abstract

We present an extensive study of surrogate losses for structured prediction supported by H-consistency bounds. These are recently introduced guarantees that are more relevant to learning than Bayes-consistency, since they are not asymptotic and since they take into account the hypothesis set H used. We first show that no nontrivial H-consistency bound can be derived for widely used surrogate structured prediction losses. We then define several new families of surrogate losses, including structured comp-sum losses and structured constrained losses, for which we prove H-consistency bounds and thus Bayes-consistency. These loss functions readily lead to new structured prediction algorithms with stronger theoretical guarantees, based on their minimization. We describe efficient algorithms for minimizing several of these surrogate losses, including a new structured logistic loss.
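For context, an H-consistency bound in this line of work typically takes the following general form; this is a sketch based on the standard formulation, with illustrative notation (the calligraphic symbols and the function Γ below are not necessarily those used in the paper):

\[
\mathcal{E}_{\mathsf{L}}(h) \;-\; \mathcal{E}_{\mathsf{L}}^{*}(\mathcal{H}) \;+\; \mathcal{M}_{\mathsf{L}}(\mathcal{H})
\;\le\;
\Gamma\!\Big( \mathcal{E}_{\ell}(h) \;-\; \mathcal{E}_{\ell}^{*}(\mathcal{H}) \;+\; \mathcal{M}_{\ell}(\mathcal{H}) \Big),
\qquad \forall\, h \in \mathcal{H},
\]

where \(\mathcal{E}_{\mathsf{L}}(h)\) denotes the expected target (structured) loss of a hypothesis \(h\), \(\mathcal{E}_{\ell}(h)\) its expected surrogate loss, \(\mathcal{E}^{*}(\mathcal{H})\) the corresponding best-in-class error, \(\mathcal{M}(\mathcal{H})\) the associated minimizability gap, and \(\Gamma\) a non-decreasing concave function. Unlike Bayes-consistency, such a guarantee is non-asymptotic and is stated relative to the specific hypothesis set \(\mathcal{H}\) used.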

Original language: English (US)
Journal: Advances in Neural Information Processing Systems
Volume: 36
State: Published - 2023
Event: 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States
Duration: Dec 10, 2023 - Dec 16, 2023

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
