Abstract
It is well known that quantization cannot increase the Kullback-Leibler divergence, which can be thought of as the expected value, or first moment, of the log-likelihood ratio. In this paper, we investigate the effects of quantization on the second moment of the log-likelihood ratio. It is shown via the convex domination technique that quantization may increase the second moment, but the increase is bounded above by 2/e. The result is then applied to decentralized sequential detection problems, not only to provide simpler sufficient conditions for asymptotic optimality in the simplest models, but also to shed new light on more complicated models. In addition, some brief remarks on other higher-order moments of the log-likelihood ratio are provided.
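The two central claims can be checked numerically on a small hypothetical example (the distributions below are illustrative, not taken from the paper). Merging two symbols whose likelihood ratios both lie below 1/e, where x(log x)² is concave, makes the second moment of the log-likelihood ratio grow even though the KL divergence shrinks:

```python
import math

# Hypothetical 3-symbol example (not from the paper): distributions P and Q
# on {0, 1, 2}; the quantizer merges symbols 1 and 2 into a single bin.
p = [0.89, 0.10, 0.01]
q = [0.50, 0.30, 0.20]

def kl(p, q):
    """Kullback-Leibler divergence D(P||Q) = E_P[log(p/q)], i.e. the
    first moment of the log-likelihood ratio under P."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def second_moment(p, q):
    """Second moment of the log-likelihood ratio, E_P[(log(p/q))^2]."""
    return sum(pi * math.log(pi / qi) ** 2 for pi, qi in zip(p, q))

# Quantized (merged) distributions: symbols 1 and 2 become one bin.
p_q = [p[0], p[1] + p[2]]
q_q = [q[0], q[1] + q[2]]

# Data processing: the KL divergence cannot increase under quantization.
assert kl(p_q, q_q) <= kl(p, q)

# The second moment, by contrast, increases in this example...
assert second_moment(p_q, q_q) > second_moment(p, q)
# ...but by less than the paper's universal bound of 2/e.
assert second_moment(p_q, q_q) - second_moment(p, q) < 2 / math.e
```

Both merged symbols here have likelihood ratios (0.10/0.30 ≈ 0.33 and 0.01/0.20 = 0.05) below 1/e ≈ 0.37, which is exactly the regime where merging can inflate the second moment.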
Original language | English (US) |
---|---|
Article number | 6400259 |
Pages (from-to) | 1536-1543 |
Number of pages | 8 |
Journal | IEEE Transactions on Signal Processing |
Volume | 61 |
Issue number | 6 |
DOIs | |
State | Published - 2013 |
Keywords
- Convex domination
- decentralized detection
- Kullback-Leibler
- log-sum inequality
- quantization
- quickest change detection
- sequential detection
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering