Abstract
This work introduces a method for learning low-dimensional models of high-dimensional black-box dynamical systems from data. The novelty is that the learned models are exactly the reduced models that are traditionally constructed with classical projection-based model reduction techniques. Thus, the proposed approach learns models that are guaranteed to have the well-studied properties of reduced models known from model reduction, without requiring full knowledge of the governing equations and without requiring access to the operators of the high-dimensional systems. The key ingredient is a new data sampling scheme that obtains re-projected trajectories of high-dimensional systems, which correspond to Markovian dynamics in low-dimensional subspaces. The exact recovery of reduced models from these re-projected trajectories is guaranteed preasymptotically, that is, for finite amounts of data, under certain conditions and for a large class of systems with polynomial nonlinear terms. Numerical results demonstrate that the low-dimensional models learned with the proposed approach match reduced models from traditional model reduction up to numerical errors in practice. The numerical results further indicate that low-dimensional models fitted to re-projected trajectories are predictive even in situations where models fitted to trajectories without re-projection are inaccurate and unstable.
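The abstract describes re-projection and the recovery of reduced operators only at a high level. The following minimal Python sketch illustrates the idea for a linear, discrete-time full-order model; the names `step_full_model`, `V`, `reprojected_trajectory`, and `infer_reduced_operator` are illustrative assumptions and not the paper's implementation or notation.

```python
# Minimal sketch of operator inference with re-projection, assuming a linear
# black-box time stepper x_{k+1} = f(x_k) and a given basis matrix V with
# orthonormal columns (e.g., from proper orthogonal decomposition).
import numpy as np

def reprojected_trajectory(step_full_model, V, x0, num_steps):
    """Sample a re-projected trajectory: after each full-model step, the state
    is projected back onto the subspace spanned by the columns of V, so the
    sampled reduced states follow Markovian dynamics in that subspace."""
    xr = V.T @ x0                          # initial reduced state
    Xr, Yr = [], []
    for _ in range(num_steps):
        x_full = step_full_model(V @ xr)   # one step of the black-box full model
        xr_next = V.T @ x_full             # re-projection onto the subspace
        Xr.append(xr)
        Yr.append(xr_next)
        xr = xr_next
    return np.array(Xr), np.array(Yr)

def infer_reduced_operator(Xr, Yr):
    """Least-squares fit of a reduced operator A_r with xr_{k+1} ~ A_r xr_k."""
    M, *_ = np.linalg.lstsq(Xr, Yr, rcond=None)
    return M.T

# Hypothetical usage with a linear full model given by a matrix A:
#   Xr, Yr = reprojected_trajectory(lambda x: A @ x, V, x0, num_steps=100)
#   Ar = infer_reduced_operator(Xr, Yr)
```

For systems with polynomial (e.g., quadratic) nonlinear terms, the same least-squares fit would include the corresponding polynomial terms of the reduced state in the data matrix; the sketch keeps only the linear case for brevity.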
| Original language | English (US) |
|---|---|
| Pages (from-to) | A3489-A3515 |
| Journal | SIAM Journal on Scientific Computing |
| Volume | 42 |
| Issue number | 5 |
| DOIs | |
| State | Published - 2020 |
Keywords
- Data-driven modeling
- Nonintrusive model reduction
- Operator inference
- Proper orthogonal decomposition
- Reduced basis method
ASJC Scopus subject areas
- Computational Mathematics
- Applied Mathematics