Abstract
Convergence analyses for the least mean square algorithm with update delay (DLMS) exist, but most rely on the unrealistic assumption that successive input vectors are independent. In this paper we consider the DLMS algorithm with decreasing step size μ(n) = a/n, a > 0, and prove almost-sure convergence of the algorithm under the assumptions that the input is mixing, satisfies a law of large numbers, and is uniformly bounded.
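The delayed update with μ(n) = a/n can be sketched as follows. This is a minimal NumPy illustration, not the paper's construction: the filter order, delay, step-size constant, and toy identification setup are all assumptions, and the white Gaussian input used here is only a convenient stand-in for the mixing, uniformly bounded inputs the almost-sure convergence proof actually requires.

```python
import numpy as np

def dlms(x, d, order=4, delay=2, a=0.5):
    """Delayed LMS with decreasing step size mu(n) = a/n.

    x : input signal, d : desired signal.
    The coefficient update at time n uses the error and input
    regressor from `delay` iterations earlier (the DLMS structure).
    """
    N = len(x)
    w = np.zeros(order)               # adaptive filter coefficients
    e = np.zeros(N)                   # a-priori error sequence
    X = np.zeros((N, order))          # regressor history
    for n in range(N):
        X[n] = [x[n - k] if n - k >= 0 else 0.0 for k in range(order)]
        e[n] = d[n] - X[n] @ w        # error with current coefficients
        m = n - delay                 # delayed index used in the update
        if m >= 0:
            mu = a / (n + 1)          # decreasing step size mu(n) = a/n
            w += mu * e[m] * X[m]
    return w, e

# Hypothetical system-identification example: recover a known FIR filter.
rng = np.random.default_rng(0)
h = np.array([1.0, -0.5, 0.25, 0.1])     # "unknown" system to identify
x = rng.standard_normal(20000)           # stand-in input (not mixing/bounded)
d = np.convolve(x, h)[:len(x)]           # noiseless desired signal
w, e = dlms(x, d, order=4, delay=2, a=0.5)
```

In this noiseless toy run the coefficient estimate `w` drifts toward `h` and the error magnitude decreases over time, consistent with the slow 1/n-step-size behavior typical of stochastic-approximation schemes.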
Original language | English (US) |
---|---|
Pages (from-to) | 1854-1857 |
Number of pages | 4 |
Journal | ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings |
Volume | 3 |
State | Published - 1996 |
Event | Proceedings of the 1996 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP. Part 1 (of 6) - Atlanta, GA, USA |
Duration | May 7, 1996 → May 10, 1996 |
ASJC Scopus subject areas
- Software
- Signal Processing
- Electrical and Electronic Engineering