TY - GEN
T1 - Stability of the Lanczos method for matrix function approximation
AU - Musco, Cameron
AU - Musco, Christopher
AU - Sidford, Aaron
N1 - Publisher Copyright:
© Copyright 2018 by SIAM.
PY - 2018
Y1 - 2018
N2 - Theoretically elegant and ubiquitous in practice, the Lanczos method can approximate f(A)x for any symmetric matrix A ∈ R^(n×n), vector x ∈ R^n, and function f. In exact arithmetic, the method's error after k iterations is bounded by the error of the best degree-k polynomial uniformly approximating the scalar function f(x) on the range [λ_min(A), λ_max(A)]. However, despite decades of work, it has been unclear if this powerful guarantee holds in finite precision. We resolve this problem, proving that when max_{x ∈ [λ_min, λ_max]} |f(x)| ≤ C, Lanczos essentially matches the exact arithmetic guarantee if computations use roughly log(nC||A||) bits of precision. Our proof extends work of Druskin and Knizhnerman [11], leveraging the stability of the classic Chebyshev recurrence to bound the stability of any polynomial approximating f(x). We also study the special case of f(A) = A^(-1) for positive definite A, where stronger guarantees hold for Lanczos. In exact arithmetic the algorithm performs as well as the best polynomial approximating 1/x at each of A's eigenvalues, rather than on the full range [λ_min(A), λ_max(A)]. In seminal work, Greenbaum gives a natural approach to extending this bound to finite precision: she proves that finite precision Lanczos and the related conjugate gradient method match any polynomial approximating 1/x in a tiny range around each eigenvalue [17]. For A^(-1), Greenbaum's bound appears stronger than our result. However, we exhibit matrices with condition number κ where exact arithmetic Lanczos converges in polylog(κ) iterations, but Greenbaum's bound predicts at best Ω(κ^(1/5)) iterations in finite precision. It thus cannot offer more than a polynomial improvement over the Õ(κ^(1/2)) bound achievable via our result for general f(A). Our analysis bounds the power of stable approximating polynomials and raises the question of whether they fully characterize the behavior of finite precision Lanczos in solving linear systems. If they do, convergence in fewer than poly(κ) iterations cannot be expected, even for matrices with clustered, skewed, or otherwise favorable eigenvalue distributions.
AB - Theoretically elegant and ubiquitous in practice, the Lanczos method can approximate f(A)x for any symmetric matrix A ∈ R^(n×n), vector x ∈ R^n, and function f. In exact arithmetic, the method's error after k iterations is bounded by the error of the best degree-k polynomial uniformly approximating the scalar function f(x) on the range [λ_min(A), λ_max(A)]. However, despite decades of work, it has been unclear if this powerful guarantee holds in finite precision. We resolve this problem, proving that when max_{x ∈ [λ_min, λ_max]} |f(x)| ≤ C, Lanczos essentially matches the exact arithmetic guarantee if computations use roughly log(nC||A||) bits of precision. Our proof extends work of Druskin and Knizhnerman [11], leveraging the stability of the classic Chebyshev recurrence to bound the stability of any polynomial approximating f(x). We also study the special case of f(A) = A^(-1) for positive definite A, where stronger guarantees hold for Lanczos. In exact arithmetic the algorithm performs as well as the best polynomial approximating 1/x at each of A's eigenvalues, rather than on the full range [λ_min(A), λ_max(A)]. In seminal work, Greenbaum gives a natural approach to extending this bound to finite precision: she proves that finite precision Lanczos and the related conjugate gradient method match any polynomial approximating 1/x in a tiny range around each eigenvalue [17]. For A^(-1), Greenbaum's bound appears stronger than our result. However, we exhibit matrices with condition number κ where exact arithmetic Lanczos converges in polylog(κ) iterations, but Greenbaum's bound predicts at best Ω(κ^(1/5)) iterations in finite precision. It thus cannot offer more than a polynomial improvement over the Õ(κ^(1/2)) bound achievable via our result for general f(A). Our analysis bounds the power of stable approximating polynomials and raises the question of whether they fully characterize the behavior of finite precision Lanczos in solving linear systems. If they do, convergence in fewer than poly(κ) iterations cannot be expected, even for matrices with clustered, skewed, or otherwise favorable eigenvalue distributions.
UR - http://www.scopus.com/inward/record.url?scp=85045556820&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85045556820&partnerID=8YFLogxK
U2 - 10.1137/1.9781611975031.105
DO - 10.1137/1.9781611975031.105
M3 - Conference contribution
AN - SCOPUS:85045556820
T3 - Proceedings of the Annual ACM-SIAM Symposium on Discrete Algorithms
SP - 1605
EP - 1624
BT - 29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018
A2 - Czumaj, Artur
PB - Association for Computing Machinery
T2 - 29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018
Y2 - 7 January 2018 through 10 January 2018
ER -