Minimax estimation of divergences between discrete distributions

Yanjun Han, Jiantao Jiao, Tsachy Weissman

Research output: Contribution to journal › Article › peer-review

Abstract

We study the minimax estimation of α-divergences between discrete distributions for integer α ≥ 1, which include the Kullback-Leibler divergence and the χ²-divergences as special examples. Dropping the usual theoretical tricks to acquire independence, we construct the first minimax rate-optimal estimator which does not require any Poissonization, sample splitting, or explicit construction of approximating polynomials. The estimator uses a hybrid approach which solves a problem-independent linear program based on moment matching in the non-smooth regime, and applies a problem-dependent bias-corrected plug-in estimator in the smooth regime, with a soft decision boundary between these regimes.
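To make the "bias-corrected plug-in" idea concrete, here is a minimal sketch for the χ²-divergence χ²(P, Q) = Σᵢ pᵢ²/qᵢ − 1, estimated from multinomial counts. It uses the standard fact that falling factorials of the counts are unbiased for monomials of pᵢ. Everything in it (the function names, the add-constant smoothing of the estimate of Q) is an illustrative assumption; this is not the paper's estimator, whose smooth-regime correction is problem-dependent and whose non-smooth regime instead solves a moment-matching linear program.

```python
import numpy as np

def plugin_chi2(x, y, smooth=0.5):
    """Naive plug-in estimate of chi^2(P, Q) = sum_i p_i^2 / q_i - 1,
    given count vectors x ~ Multinomial(m, P) and y ~ Multinomial(n, Q).
    The add-constant `smooth` keeps q_hat > 0 and is purely illustrative."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    p_hat = x / x.sum()
    q_hat = (y + smooth) / (y.sum() + smooth * y.size)
    return float(np.sum(p_hat ** 2 / q_hat) - 1.0)

def corrected_chi2(x, y, smooth=0.5):
    """Plug-in estimate with a debiased numerator: for binomial marginals,
    E[X_i (X_i - 1)] = m (m - 1) p_i^2, so the falling factorial
    X_i (X_i - 1) / (m (m - 1)) is unbiased for p_i^2. The smoothed
    q_hat in the denominator remains biased; the paper's smooth-regime
    correction is more refined than this sketch."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    m = x.sum()
    p2_hat = x * (x - 1.0) / (m * (m - 1.0))
    q_hat = (y + smooth) / (y.sum() + smooth * y.size)
    return float(np.sum(p2_hat / q_hat) - 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    P, Q = np.array([0.5, 0.3, 0.2]), np.array([0.4, 0.4, 0.2])
    x, y = rng.multinomial(1000, P), rng.multinomial(1000, Q)
    print(plugin_chi2(x, y), corrected_chi2(x, y))  # true value here is 0.05
```

For integer α, the same falling-factorial device yields unbiased estimates of pᵢ^α, which is one reason the integer-α case considered in the paper is amenable to moment-based constructions.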

Original language: English (US)
Article number: 3041036
Pages (from-to): 814-823
Number of pages: 10
Journal: IEEE Journal on Selected Areas in Information Theory
Volume: 1
Issue number: 3
State: Published - Nov 2020

Keywords

  • Functional estimation
  • Information measures
  • Linear programming
  • Minimax estimation
  • Polynomial approximation

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Media Technology
  • Artificial Intelligence
  • Applied Mathematics
