Geometric Lower Bounds for Distributed Parameter Estimation under Communication Constraints

Yanjun Han, Ayfer Özgür, Tsachy Weissman

Research output: Contribution to journal › Conference article › peer-review

Abstract

We consider parameter estimation in distributed networks, where each sensor in the network observes an independent sample from an underlying distribution and has k bits to communicate its sample to a centralized processor which computes an estimate of a desired parameter. We develop lower bounds on the minimax risk of estimating the underlying parameter under squared ℓ2 loss for a large class of distributions. Our results show that under mild regularity conditions, the communication constraint reduces the effective sample size by a factor of d when k is small, where d is the dimension of the estimated parameter. Furthermore, this penalty decreases at most exponentially with increasing k, and this exponential decrease is achieved for some models, e.g., estimating high-dimensional distributions. For other models, however, we show that the sample size reduction is remediated only linearly with increasing k, e.g., when some sub-Gaussian structure is available. We apply our results to the distributed setting with the product Bernoulli model, the multinomial model, and dense/sparse Gaussian location models, recovering or strengthening existing results. Our approach deviates significantly from existing approaches to developing information-theoretic lower bounds for communication-efficient estimation. We circumvent the need for strong data processing inequalities used in prior work and develop a geometric approach that builds on a new representation of the communication constraint. This approach allows us to strengthen and generalize existing results with simpler and more transparent proofs.
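The following is a hedged reading of the scalings stated in the abstract, written as a short LaTeX sketch rather than a quotation of the paper's theorems; exact constants, parameter ranges, and regularity conditions are given in the full text. With n sensors, k bits per sensor, and parameter dimension d, the abstract's statements can be summarized through an effective sample size:

% Hedged illustration only: consistent with the abstract's claims, not a verbatim result.
\[
n_{\mathrm{eff}} \;\asymp\;
\begin{cases}
\dfrac{n\,\min(2^{k},\, d)}{d}, & \text{exponential remediation in } k \text{ (e.g., high-dimensional distribution estimation)},\\[1.5ex]
\dfrac{n\,\min(k,\, d)}{d},     & \text{linear remediation in } k \text{ (e.g., models with sub-Gaussian structure)}.
\end{cases}
\]

In both cases, taking k small recovers the factor-of-d reduction in effective sample size described above. As an illustrative (assumed, not quoted) consequence for the dense Gaussian location model with noise level σ, this reading suggests a squared-ℓ2 minimax risk on the order of σ²d²/(nk) when k is at most of order d, compared with the unconstrained rate of order σ²d/n.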

Original language: English (US)
Pages (from-to): 3163-3188
Number of pages: 26
Journal: Proceedings of Machine Learning Research
Volume: 75
State: Published - 2018
Event: 31st Annual Conference on Learning Theory, COLT 2018 - Stockholm, Sweden
Duration: Jul 6 2018 - Jul 9 2018

Keywords

  • Blackboard communication protocol
  • Distributed estimation
  • High-dimensional geometry
  • Minimax lower bound
  • Strong data processing inequality

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
