Quantization effect on the log-likelihood ratio and its application to decentralized sequential detection

Yan Wang, Yajun Mei

Research output: Contribution to journal › Article › peer-review

Abstract

It is well known that quantization cannot increase the Kullback-Leibler divergence, which can be thought of as the expected value or first moment of the log-likelihood ratio. In this paper, we investigate the effect of quantization on the second moment of the log-likelihood ratio. It is shown via the convex domination technique that quantization may increase the second moment, but the increase is bounded above by 2/e. The result is then applied to decentralized sequential detection problems, not only to provide simpler sufficient conditions for asymptotic optimality theories in the simplest models, but also to shed new light on more complicated models. In addition, some brief remarks on other higher-order moments of the log-likelihood ratio are also provided.
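The following is a minimal numerical sketch, not the paper's construction: it assumes two Gaussian hypotheses N(0,1) and N(1,1) and a hypothetical one-bit threshold quantizer, and compares the first moment (the Kullback-Leibler divergence) and the second moment of the log-likelihood ratio before and after quantization. The hypotheses and the threshold value are illustrative choices only.

```python
# Hedged illustration: first and second moments of the log-likelihood ratio
# under the alternative hypothesis, before and after a one-bit quantizer.
import numpy as np
from scipy.stats import norm

mu0, mu1, sigma = 0.0, 1.0, 1.0   # assumed hypotheses: N(0,1) vs N(1,1)
t = 0.5                            # assumed quantizer threshold

# Raw-observation LLR: L(X) = ((mu1 - mu0)/sigma^2) * (X - (mu0 + mu1)/2).
delta = (mu1 - mu0) / sigma
kl_raw = 0.5 * delta**2                        # E_1[L], the KL divergence
second_raw = kl_raw**2 + delta**2              # E_1[L^2] = (E_1[L])^2 + Var_1[L]

# Quantized observation Z = 1{X > t}: Bernoulli under each hypothesis.
p0 = norm.sf(t, loc=mu0, scale=sigma)          # P_0(Z = 1)
p1 = norm.sf(t, loc=mu1, scale=sigma)          # P_1(Z = 1)
llr1, llr0 = np.log(p1 / p0), np.log((1 - p1) / (1 - p0))
kl_q = p1 * llr1 + (1 - p1) * llr0             # E_1[L(Z)]
second_q = p1 * llr1**2 + (1 - p1) * llr0**2   # E_1[L(Z)^2]

print(f"KL:     raw = {kl_raw:.4f}, quantized = {kl_q:.4f}  (never increases)")
print(f"E[L^2]: raw = {second_raw:.4f}, quantized = {second_q:.4f}")
```

The quantized KL divergence is always no larger than the raw one (data-processing inequality); the second moment has no such guarantee, which is the phenomenon the paper bounds by 2/e.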

Original language: English (US)
Article number: 6400259
Pages (from-to): 1536-1543
Number of pages: 8
Journal: IEEE Transactions on Signal Processing
Volume: 61
Issue number: 6
DOIs
State: Published - 2013

Keywords

  • Convex domination
  • decentralized detection
  • Kullback-Leibler
  • log-sum inequality
  • quantization
  • quickest change detection
  • sequential detection

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
