Abstract
Given a finite set of autocorrelations, it is well known that maximization of the entropy functional subject to these data leads to a stable autoregressive (AR) model. Since maximization of the entropy functional is equivalent to maximization of the minimum mean square error associated with one-step predictors, the problem of obtaining admissible extensions that maximize the k-step minimum mean square prediction error subject to the given autocorrelations is meaningful, and it has been shown to result in stable ARMA extensions (see the work by Pillai et al.). The uniqueness of this true generalization of the maximum entropy extension is proved here through a constructive procedure in the case of two-step predictors.
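For reference, a standard statement of the constrained entropy maximization mentioned above follows; the notation ($r_k$ for the given autocorrelations, $S(\theta)$ for the power spectral density, $a_k$ and $\sigma^2$ for the AR parameters) is conventional and is assumed here rather than taken from the paper itself.

$$
\max_{S(\theta)\,\ge\,0}\;\; \frac{1}{2\pi}\int_{-\pi}^{\pi} \ln S(\theta)\,d\theta
\quad \text{subject to} \quad
\frac{1}{2\pi}\int_{-\pi}^{\pi} S(\theta)\,e^{jk\theta}\,d\theta = r_k,\quad k = 0,1,\dots,n,
$$

whose maximizer is the stable AR($n$) spectrum

$$
S_{\mathrm{ME}}(\theta) = \frac{\sigma^2}{\bigl|\,1 + \sum_{k=1}^{n} a_k\,e^{-jk\theta}\bigr|^{2}},
$$

with $a_k$ and $\sigma^2$ obtained from the Yule-Walker (Levinson) equations for $r_0,\dots,r_n$. By the Kolmogorov-Szegő formula, the one-step minimum mean square prediction error of a process with spectrum $S$ is

$$
P_1 = \exp\!\left(\frac{1}{2\pi}\int_{-\pi}^{\pi} \ln S(\theta)\,d\theta\right),
$$

so maximizing the entropy integral is equivalent to maximizing $P_1$; the extension treated in the abstract replaces $P_1$ with the k-step error and yields ARMA rather than AR models.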
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 2942-2946 |
| Number of pages | 5 |
| Journal | IEEE Transactions on Signal Processing |
| Volume | 41 |
| Issue number | 9 |
| State | Published - Sep 1993 |
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering