We present data-dependent learning bounds for the general scenario of non-stationary non-mixing stochastic processes. Our learning guarantees are expressed in terms of a data-dependent measure of sequential complexity and a discrepancy measure that can be estimated from data under some mild assumptions. Our learning bounds guide the design of new algorithms for non-stationary time series forecasting, for which we report several favorable experimental results.
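The discrepancy measure mentioned above compares the target (future) distribution with the observed sample. A minimal sketch of one estimable proxy, assuming a finite hypothesis set with precomputed per-step losses (the function name, array shapes, and the choice of the last `r` steps as a proxy for the target distribution are illustrative assumptions, not the paper's exact definition):

```python
import numpy as np

def empirical_discrepancy(losses, r):
    """Estimate a discrepancy between recent and full-sample behavior.

    losses: array of shape (num_hypotheses, T) holding the loss of each
            hypothesis at each of T time steps (hypothetical finite set).
    r:      number of most recent steps used as a proxy for the target
            distribution.
    Returns the largest gap, over hypotheses, between the average loss
    on the last r steps and the average loss on the whole sample.
    """
    losses = np.asarray(losses, dtype=float)
    recent = losses[:, -r:].mean(axis=1)   # average loss on last r steps
    overall = losses.mean(axis=1)          # average loss on all T steps
    return float(np.max(np.abs(recent - overall)))
```

A small discrepancy suggests the recent past is representative of the full sample, so stationarity-style guarantees degrade gracefully; a large one signals distribution drift.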
Keywords

- Expected sequential covering numbers
- Generalization bounds
- Sequential Rademacher complexity
- Time series