Given a finite set of autocorrelations, it is well known that maximizing the entropy functional subject to this data leads to a stable autoregressive (AR) model. Since maximizing the entropy functional is equivalent to maximizing the minimum mean-square error associated with one-step predictors, it is meaningful to ask for admissible extensions that maximize the k-step minimum mean-square prediction error subject to the given autocorrelations; this problem has been shown to result in stable ARMA extensions (see the work by Pillai et al.). The uniqueness of this true generalization of the maximum entropy extension is proved here through a constructive procedure for the case of two-step predictors.
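The classical one-step result the abstract builds on can be sketched concretely: given a finite set of autocorrelations, the maximum entropy extension is the stable AR model obtained by solving the Yule-Walker equations, e.g. via the Levinson-Durbin recursion. The sketch below is illustrative only (the function name `levinson_durbin` and the AR(1) test data are assumptions, not from the paper) and does not implement the two-step ARMA extension that is the paper's subject.

```python
import numpy as np

def levinson_durbin(r, p):
    """Solve the Yule-Walker equations for an AR(p) model from
    autocorrelations r[0..p] via the Levinson-Durbin recursion.

    Returns the AR polynomial coefficients a (a[0] = 1) and the
    one-step minimum mean-square prediction error.
    """
    a = np.zeros(p + 1)
    a[0] = 1.0
    err = r[0]
    for m in range(1, p + 1):
        # Partial correlation (reflection) coefficient at order m.
        acc = sum(a[j] * r[m - j] for j in range(m))
        k = -acc / err
        # Order-update of the AR coefficients.
        a_prev = a.copy()
        for j in range(1, m):
            a[j] = a_prev[j] + k * a_prev[m - j]
        a[m] = k
        # The prediction error shrinks by the factor (1 - k^2);
        # |k| < 1 guarantees a stable AR extension.
        err *= (1.0 - k * k)
    return a, err

# Autocorrelations consistent with the AR(1) process x[n] = 0.5 x[n-1] + w[n]:
r0 = 4.0 / 3.0
r = [r0, 0.5 * r0, 0.25 * r0]
a, err = levinson_durbin(r, 2)
```

For these autocorrelations the recursion recovers a[1] = -0.5 and a[2] = 0, with unit one-step prediction error, i.e. the extension collapses to the underlying AR(1) model.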