Optimal Prediction of Markov Chains With and Without Spectral Gap

Yanjun Han, Soham Jana, Yihong Wu

Research output: Contribution to journal › Article › peer-review

Abstract

We study the following learning problem with dependent data: observing a trajectory of length $n$ from a stationary Markov chain with $k$ states, the goal is to predict the next state. For $3 \le k \le O(\sqrt{n})$, using techniques from universal compression, the optimal prediction risk in Kullback-Leibler divergence is shown to be $\Theta\left(\frac{k^2}{n}\log\frac{n}{k^2}\right)$, in contrast to the optimal rate of $\Theta\left(\frac{\log\log n}{n}\right)$ for $k=2$ previously shown in Falahatgar et al. (2016). These rates, slower than the parametric rate of $O\left(\frac{k^2}{n}\right)$, can be attributed to the memory in the data, as the spectral gap of the Markov chain can be arbitrarily small. To quantify the memory effect, we study irreducible reversible chains with a prescribed spectral gap. In addition to characterizing the optimal prediction risk for two states, we show that, as long as the spectral gap is not excessively small, the prediction risk in the Markov model is $O\left(\frac{k^2}{n}\right)$, which coincides with that of an iid model with the same number of parameters. Extensions to higher-order Markov chains are also obtained.
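To make the prediction setup concrete, the following is a minimal Python sketch of the problem, not the paper's rate-optimal estimator: it simulates a stationary two-state chain, forms an add-one-smoothed estimate of the transition row for the last observed state, and Monte-Carlo-estimates the Kullback-Leibler prediction risk. The function names, the specific chain, and the choice of add-one smoothing are all illustrative assumptions.

```python
import numpy as np

def sample_trajectory(P, pi, n, rng):
    """Sample X_1, ..., X_n from a stationary Markov chain with
    transition matrix P and stationary distribution pi."""
    k = P.shape[0]
    x = np.empty(n, dtype=int)
    x[0] = rng.choice(k, p=pi)
    for t in range(1, n):
        x[t] = rng.choice(k, p=P[x[t - 1]])
    return x

def add_one_predictor(x, k):
    """Add-one (Laplace) smoothed estimate of the next-state
    distribution given the last observed state. This is a simple
    baseline, not the optimal predictor analyzed in the paper."""
    counts = np.ones((k, k))  # add-one smoothing of transition counts
    for a, b in zip(x[:-1], x[1:]):
        counts[a, b] += 1
    row = counts[x[-1]]
    return row / row.sum()

def kl(p, q):
    """KL divergence D(p || q); assumes q > 0 wherever p > 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1], [0.2, 0.8]])  # illustrative 2-state chain
pi = np.array([2 / 3, 1 / 3])           # its stationary distribution
n = 1000
risks = []
for _ in range(200):
    x = sample_trajectory(P, pi, n, rng)
    q = add_one_predictor(x, 2)
    # Risk of predicting the next state from the observed last state:
    # D(P(. | X_n) || Phat(. | X^n)), averaged over trajectories.
    risks.append(kl(P[x[-1]], q))
print("average KL prediction risk:", np.mean(risks))
```

Averaging the per-trajectory KL loss approximates the prediction risk studied in the paper; for slowly mixing chains this baseline can be far from the optimal rates quoted above.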

Original language: English (US)
Pages (from-to): 3920-3959
Number of pages: 40
Journal: IEEE Transactions on Information Theory
Volume: 69
Issue number: 6
State: Published - Jun 1 2023

Keywords

  • Kullback-Leibler risk
  • Markov chains
  • higher-order Markov chains
  • mixing time
  • prediction
  • redundancy
  • spectral gap

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences
