Generic Bounds on the Maximum Deviations in Sequential Prediction: An Information-Theoretic Analysis

Song Fang, Quanyan Zhu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we derive generic bounds on the maximum deviations in prediction errors for sequential prediction via an information-theoretic approach. The fundamental bounds are shown to depend only on the conditional entropy of the data point to be predicted given the previous data points. In the asymptotic case, the bounds are achieved if and only if the prediction error is white and uniformly distributed.
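The abstract does not state the bound in closed form; the following is an illustrative maximum-entropy sketch consistent with its claims, not the paper's derivation. The symbols (error $e_k$, support bound $a$, differential entropy $h(\cdot)$ in bits) are assumptions introduced here for illustration.

```latex
% Illustrative sketch (not taken from the paper). Let e_k = x_k - \hat{x}_k
% be the prediction error, essentially supported on [-a, a], with \hat{x}_k
% a function of the past data points x_{k-1}, \ldots, x_1.
%
% Since \hat{x}_k is determined by the past,
%   h(x_k \mid x_{k-1}, \ldots, x_1) = h(e_k \mid x_{k-1}, \ldots, x_1) \le h(e_k).
% Among all densities on [-a, a], the uniform maximizes differential entropy:
%   h(e_k) \le \log_2 (2a).
% Combining the two inequalities gives a lower bound on the maximum deviation,
%   \operatorname{ess\,sup} |e_k| = a \ge \tfrac{1}{2}\, 2^{\,h(x_k \mid x_{k-1}, \ldots, x_1)},
% which depends only on the conditional entropy of the point to be predicted
% given the previous points. Both inequalities are tight iff e_k is uniform
% on [-a, a] and independent of the past, i.e., the prediction error is white
% and uniformly distributed -- matching the achievability condition in the abstract.
```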

Original language: English (US)
Title of host publication: 2019 IEEE 29th International Workshop on Machine Learning for Signal Processing, MLSP 2019
Publisher: IEEE Computer Society
ISBN (Electronic): 9781728108247
State: Published - Oct 2019
Event: 29th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2019 - Pittsburgh, United States
Duration: Oct 13, 2019 – Oct 16, 2019

Publication series

Name: IEEE International Workshop on Machine Learning for Signal Processing, MLSP
Volume: 2019-October
ISSN (Print): 2161-0363
ISSN (Electronic): 2161-0371

Conference

Conference: 29th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2019
Country: United States
City: Pittsburgh
Period: 10/13/19 – 10/16/19

Keywords

  • Information-theoretic learning
  • bounds on performance
  • sequence prediction
  • sequential learning
  • sequential prediction

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Signal Processing

