Abstract
We present data-dependent learning bounds for the general scenario of non-stationary non-mixing stochastic processes. Our learning guarantees are expressed in terms of a data-dependent measure of sequential complexity and a discrepancy measure that can be estimated from data under some mild assumptions. Our learning bounds guide the design of new algorithms for non-stationary time series forecasting for which we report several favorable experimental results.
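The discrepancy measure mentioned in the abstract can, in simplified form, be estimated directly from data. The sketch below is an illustrative toy, not the paper's estimator: it takes the supremum over a small finite grid of AR(1) predictors of the gap between average squared losses on a "past" and a "recent" segment of a series. The function name, the grid of coefficients, and the AR(1) hypothesis class are all assumptions made for illustration.

```python
import numpy as np

def empirical_discrepancy(series, split, coeffs=np.linspace(-1.0, 1.0, 41)):
    """Toy discrepancy estimate between two segments of a time series.

    For a finite grid of AR(1) predictors h_a(y_{t-1}) = a * y_{t-1},
    return the largest gap between the average squared loss on the
    segment before `split` and the segment after it. A small value
    suggests the two segments look alike to this hypothesis class;
    a large value signals non-stationarity relevant to forecasting.
    """
    y_prev, y_next = series[:-1], series[1:]
    gaps = []
    for a in coeffs:
        losses = (y_next - a * y_prev) ** 2
        past, recent = losses[:split], losses[split:]
        gaps.append(abs(past.mean() - recent.mean()))
    return max(gaps)

rng = np.random.default_rng(0)
# Stationary series: both halves drawn from the same distribution.
stationary = rng.normal(size=400)
# Non-stationary series: the mean shifts halfway through.
shifted = np.concatenate([rng.normal(size=200),
                          rng.normal(loc=3.0, size=200)])

print(empirical_discrepancy(stationary, split=200))  # small
print(empirical_discrepancy(shifted, split=200))     # much larger
```

In the paper's setting the supremum is taken over a full hypothesis class and the segments are weighted, but the qualitative behavior is the same: the estimate grows when the process drifts.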
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 367-399 |
| Number of pages | 33 |
| Journal | Annals of Mathematics and Artificial Intelligence |
| Volume | 88 |
| Issue number | 4 |
| DOIs | |
| State | Published - Apr 1 2020 |
Keywords
- Discrepancy
- Expected sequential covering numbers
- Forecasting
- Generalization bounds
- Non-mixing
- Non-stationary
- Sequential Rademacher complexity
- Time series
ASJC Scopus subject areas
- Artificial Intelligence
- Applied Mathematics