Generic Variance Bounds on Estimation and Prediction Errors in Time Series Analysis: An Entropy Perspective

Song Fang, Mikael Skoglund, Karl Henrik Johansson, Hideaki Ishii, Quanyan Zhu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we obtain generic bounds on the variances of estimation and prediction errors in time series analysis via an information-theoretic approach. In general, the error bounds are determined by the conditional entropy of the data point to be estimated or predicted given the side information or past observations. Additionally, we show that the prediction error bounds are achieved asymptotically if and only if the 'innovation' is asymptotically white Gaussian. When restricted to Gaussian processes and 1-step prediction, our bounds reduce to the Kolmogorov-Szegö formula and the Wiener-Masani formula known from linear prediction theory.
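As context for the abstract, here is a minimal LaTeX sketch of the two standard results it invokes: the entropy-power lower bound on estimation error, and the Kolmogorov-Szegö formula for the 1-step prediction error of a stationary Gaussian process. These are the textbook forms; the paper's exact statements and normalization conventions may differ.

% Entropy-power lower bound (standard form, entropy in bits): for any
% estimator \hat{x}(y) of x given side information y,
\[
  \mathbb{E}\!\left[ \bigl( x - \hat{x}(y) \bigr)^{2} \right]
  \;\ge\; \frac{1}{2\pi e}\, 2^{2 h(x \mid y)} ,
\]
% where h(x \mid y) is the conditional differential entropy; equality
% requires x given y to be Gaussian and \hat{x}(y) to be the
% conditional mean. For prediction, y is taken to be the past
% observations of the process.

% Kolmogorov-Szegö formula (scalar stationary Gaussian case): with
% power spectral density S(\omega), the minimum 1-step prediction
% error variance is
\[
  \sigma_{\min}^{2}
  \;=\; \exp\!\left( \frac{1}{2\pi} \int_{-\pi}^{\pi}
        \ln S(\omega) \, \mathrm{d}\omega \right) ,
\]
% and the Wiener-Masani formula is its multivariate analogue, with
% \det \Sigma_{\min} and \ln \det S(\omega) in place of the scalars.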

Original language: English (US)
Title of host publication: 2019 IEEE Information Theory Workshop, ITW 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781538669006
DOIs
State: Published - Aug 2019
Event: 2019 IEEE Information Theory Workshop, ITW 2019 - Visby, Sweden
Duration: Aug 25 2019 - Aug 28 2019

Publication series

Name: 2019 IEEE Information Theory Workshop, ITW 2019

Conference

Conference: 2019 IEEE Information Theory Workshop, ITW 2019
Country/Territory: Sweden
City: Visby
Period: 8/25/19 - 8/28/19

ASJC Scopus subject areas

  • Software
  • Computational Theory and Mathematics
  • Computer Networks and Communications
  • Information Systems
