Abstract
The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper, the inequalities above are extended to Rényi entropy, pth moment, and generalized Fisher information. Generalized Gaussian random densities are introduced and shown to be the extremal densities for the new inequalities. An extension of the Cramér-Rao inequality is derived as a consequence of these moment and Fisher information inequalities.
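The classical moment-entropy inequality stated above can be checked against closed-form entropies. The sketch below (illustrative only, not code from the paper) compares the differential entropy of a Gaussian against Laplace and uniform densities normalized to the same unit second moment, confirming the Gaussian attains the maximum; the entropy formulas are the standard textbook ones.

```python
import math

# Differential entropies (in nats) of three densities, each
# normalized to zero mean and unit second moment (variance 1).
# The moment-entropy inequality predicts the Gaussian is largest.

def gaussian_entropy(var):
    # h(N(0, var)) = (1/2) ln(2*pi*e*var)
    return 0.5 * math.log(2 * math.pi * math.e * var)

def laplace_entropy(b):
    # Laplace with scale b: variance = 2*b^2, h = 1 + ln(2b)
    return 1 + math.log(2 * b)

def uniform_entropy(a):
    # Uniform on [-a, a]: variance = a^2/3, h = ln(2a)
    return math.log(2 * a)

h_gauss = gaussian_entropy(1.0)
h_laplace = laplace_entropy(1 / math.sqrt(2))  # 2*b^2 = 1
h_uniform = uniform_entropy(math.sqrt(3))      # a^2/3 = 1

print(f"Gaussian: {h_gauss:.4f}  Laplace: {h_laplace:.4f}  Uniform: {h_uniform:.4f}")
assert h_gauss > h_laplace and h_gauss > h_uniform
```

At unit variance the Gaussian entropy is about 1.419 nats, exceeding both the Laplace (about 1.347) and the uniform (about 1.242), as the inequality requires.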
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 473-478 |
| Number of pages | 6 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 51 |
| Issue number | 2 |
| DOIs | |
| State | Published - Feb 2005 |
Keywords
- Entropy
- Fisher information
- Information measure
- Information theory
- Moment
- Rényi entropy
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences