Discrepancy-Based Theory and Algorithms for Forecasting Non-Stationary Time Series

Vitaly Kuznetsov, Mehryar Mohri

Research output: Contribution to journal › Article › peer-review


We present data-dependent learning bounds for the general scenario of non-stationary non-mixing stochastic processes. Our learning guarantees are expressed in terms of a data-dependent measure of sequential complexity and a discrepancy measure that can be estimated from data under some mild assumptions. Our learning bounds guide the design of new algorithms for non-stationary time series forecasting for which we report several favorable experimental results.
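As a rough illustration of the kind of quantity involved, the sketch below estimates distribution drift between two segments of a series by comparing a fixed predictor's average loss on each segment. The helper `empirical_discrepancy`, the predict-zero model, and the split point are all hypothetical assumptions for illustration; this is not the paper's discrepancy measure.

```python
import numpy as np

def empirical_discrepancy(series, model, split, loss=lambda y, p: (y - p) ** 2):
    """Toy drift estimate: gap between the model's average loss on the
    earlier segment and on the recent one. A large gap suggests
    non-stationarity. Illustrative only, not the paper's definition."""
    preds = model(series[:-1])            # one-step-ahead predictions
    errs = loss(series[1:], preds)        # per-step losses
    past, recent = errs[:split], errs[split:]
    return abs(recent.mean() - past.mean())

# Usage: a series whose mean shifts from 0 to 3 halfway through,
# scored by a trivial model that always predicts 0.
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
gap = empirical_discrepancy(series, model=np.zeros_like, split=99)
```

Here `gap` is large (roughly 9 in expectation) because the squared loss of the zero predictor jumps once the mean shifts, flagging the change in the data-generating distribution.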

Original language: English (US)
Pages (from-to): 367-399
Number of pages: 33
Journal: Annals of Mathematics and Artificial Intelligence
Issue number: 4
State: Published - Apr 1 2020


Keywords

  • Discrepancy
  • Expected sequential covering numbers
  • Forecasting
  • Generalization bounds
  • Non-mixing
  • Non-stationary
  • Sequential Rademacher complexity
  • Time series

ASJC Scopus subject areas

  • Artificial Intelligence
  • Applied Mathematics


