TY - JOUR

T1 - Optimal Prediction of Markov Chains With and Without Spectral Gap

AU - Han, Yanjun

AU - Jana, Soham

AU - Wu, Yihong

N1 - Funding Information:
The work of Yihong Wu was supported in part by NSF under Grant CCF-1900507, in part by NSF CAREER under Award CCF-1651588, and in part by the Alfred Sloan Fellowship.
Publisher Copyright:
© 1963-2012 IEEE.

PY - 2023/6/1

Y1 - 2023/6/1

N2 - We study the following learning problem with dependent data: Observing a trajectory of length n from a stationary Markov chain with k states, the goal is to predict the next state. For 3 ≤ k ≤ O(√n), using techniques from universal compression, the optimal prediction risk in Kullback-Leibler divergence is shown to be Θ((k^2/n) log(n/k^2)), in contrast to the optimal rate of Θ((log log n)/n) for k = 2 previously shown in Falahatgar et al. (2016). These rates, slower than the parametric rate of O(k^2/n), can be attributed to the memory in the data, as the spectral gap of the Markov chain can be arbitrarily small. To quantify the memory effect, we study irreducible reversible chains with a prescribed spectral gap. In addition to characterizing the optimal prediction risk for two states, we show that, as long as the spectral gap is not excessively small, the prediction risk in the Markov model is O(k^2/n), which coincides with that of an i.i.d. model with the same number of parameters. Extensions to higher-order Markov chains are also obtained.

AB - We study the following learning problem with dependent data: Observing a trajectory of length n from a stationary Markov chain with k states, the goal is to predict the next state. For 3 ≤ k ≤ O(√n), using techniques from universal compression, the optimal prediction risk in Kullback-Leibler divergence is shown to be Θ((k^2/n) log(n/k^2)), in contrast to the optimal rate of Θ((log log n)/n) for k = 2 previously shown in Falahatgar et al. (2016). These rates, slower than the parametric rate of O(k^2/n), can be attributed to the memory in the data, as the spectral gap of the Markov chain can be arbitrarily small. To quantify the memory effect, we study irreducible reversible chains with a prescribed spectral gap. In addition to characterizing the optimal prediction risk for two states, we show that, as long as the spectral gap is not excessively small, the prediction risk in the Markov model is O(k^2/n), which coincides with that of an i.i.d. model with the same number of parameters. Extensions to higher-order Markov chains are also obtained.

KW - Kullback-Leibler risk

KW - Markov chains

KW - higher-order Markov chains

KW - mixing time

KW - prediction

KW - redundancy

KW - spectral gap

UR - http://www.scopus.com/inward/record.url?scp=85148428092&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85148428092&partnerID=8YFLogxK

U2 - 10.1109/TIT.2023.3239508

DO - 10.1109/TIT.2023.3239508

M3 - Article

AN - SCOPUS:85148428092

SN - 0018-9448

VL - 69

SP - 3920

EP - 3959

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

IS - 6

ER -