Cramér-Rao and moment-entropy inequalities for Rényi entropy and generalized Fisher information

Research output: Contribution to journal › Article

Abstract

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper, the inequalities above are extended to Rényi entropy, pth moment, and generalized Fisher information. Generalized Gaussian random densities are introduced and shown to be the extremal densities for the new inequalities. An extension of the Cramér-Rao inequality is derived as a consequence of these moment and Fisher information inequalities.
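The Shannon-entropy special cases stated in the abstract can be checked numerically. The sketch below (an illustration, not from the paper) compares closed-form entropies of a Gaussian and a Laplace density at equal second moment, and verifies that the Gaussian attains equality in the classical Cramér-Rao bound Var(X) ≥ 1/J(X); the Laplace comparison density is our own choice of example.

```python
import math

# Moment-entropy inequality (Shannon case): among densities with a fixed
# second moment, the Gaussian has maximal entropy. We compare closed-form
# differential entropies (in nats) at unit variance.
sigma2 = 1.0

# Gaussian N(0, sigma^2): h = (1/2) ln(2*pi*e*sigma^2)
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma2)

# Laplace(0, b) has variance 2*b^2 and entropy h = 1 + ln(2b);
# choose b so that its variance matches sigma2.
b = math.sqrt(sigma2 / 2)
h_laplace = 1 + math.log(2 * b)

# At equal second moment, the Gaussian's entropy is strictly larger.
assert h_gauss > h_laplace

# Cramér-Rao: Var(X) >= 1 / J(X). For the Gaussian, the Fisher
# information (of the density itself) is J = 1/sigma^2, so the bound
# holds with equality -- the Gaussian is the extremal density.
J_gauss = 1.0 / sigma2
assert math.isclose(sigma2, 1.0 / J_gauss)

print(round(h_gauss, 4), round(h_laplace, 4))
```

The paper generalizes exactly this picture: Rényi entropy replaces Shannon entropy, the pth moment replaces the second moment, and generalized Gaussians replace the Gaussian as extremal densities.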

Original language: English (US)
Pages (from-to): 473-478
Number of pages: 6
Journal: IEEE Transactions on Information Theory
Volume: 51
Issue number: 2
State: Published - Feb 2005

Keywords

  • Entropy
  • Fisher information
  • Information measure
  • Information theory
  • Moment
  • Rényi entropy

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

