### Abstract

Theoretically elegant and ubiquitous in practice, the Lanczos method can approximate f(A)x for any symmetric matrix A ∈ ℝ^{n×n}, vector x ∈ ℝⁿ, and function f. In exact arithmetic, the method's error after k iterations is bounded by the error of the best degree-k polynomial uniformly approximating the scalar function f(x) on the range [λ_min(A), λ_max(A)]. However, despite decades of work, it has been unclear if this powerful guarantee holds in finite precision. We resolve this problem, proving that when max_{x ∈ [λ_min(A), λ_max(A)]} |f(x)| ≤ C, Lanczos essentially matches the exact arithmetic guarantee if computations use roughly log(nC‖A‖) bits of precision. Our proof extends work of Druskin and Knizhnerman [11], leveraging the stability of the classic Chebyshev recurrence to bound the stability of any polynomial approximating f(x).

We also study the special case of f(A) = A⁻¹ for positive definite A, where stronger guarantees hold for Lanczos. In exact arithmetic the algorithm performs as well as the best polynomial approximating 1/x at each of A's eigenvalues, rather than on the full range [λ_min(A), λ_max(A)]. In seminal work, Greenbaum gives a natural approach to extending this bound to finite precision: she proves that finite precision Lanczos and the related conjugate gradient method match any polynomial approximating 1/x in a tiny range around each eigenvalue [17].

For A⁻¹, Greenbaum's bound appears stronger than our result. However, we exhibit matrices with condition number κ where exact arithmetic Lanczos converges in polylog(κ) iterations, but Greenbaum's bound predicts at best Ω(κ^{1/5}) iterations in finite precision. It thus cannot offer more than a polynomial improvement over the O(κ^{1/2}) bound achievable via our result for general f(A). Our analysis bounds the power of stable approximating polynomials and raises the question of whether they fully characterize the behavior of finite precision Lanczos in solving linear systems. If they do, convergence in fewer than poly(κ) iterations cannot be expected, even for matrices with clustered, skewed, or otherwise favorable eigenvalue distributions.
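The abstract's setup can be made concrete with a minimal sketch of the Lanczos method for approximating f(A)x: build an orthonormal Krylov basis Q and tridiagonal matrix T = QᵀAQ, then return ‖x‖·Q f(T) e₁. This is an illustrative implementation, not code from the paper; it uses full reorthogonalization to mimic the exact-arithmetic behavior the abstract describes, and the function name and structure are the author's own (hypothetical) choices.

```python
# Hedged sketch of the Lanczos method for f(A)x (illustrative, not from the paper).
# Full reorthogonalization is used to approximate exact-arithmetic behavior.
import numpy as np

def lanczos_fAx(A, x, f, k):
    """Approximate f(A) @ x using k Lanczos iterations.

    A: symmetric (n, n) array; f: scalar function applied entrywise to eigenvalues.
    """
    n = len(x)
    Q = np.zeros((n, k))
    alpha = np.zeros(k)          # diagonal of the tridiagonal matrix T
    beta = np.zeros(k)           # off-diagonal of T; beta[j] links steps j, j+1
    Q[:, 0] = x / np.linalg.norm(x)
    for j in range(k):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        w -= alpha[j] * Q[:, j]
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        # Reorthogonalize against all previous vectors (costly but stable).
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        if j + 1 < k:
            beta[j] = np.linalg.norm(w)
            if beta[j] == 0:     # exact invariant subspace found; stop early
                Q, alpha, beta = Q[:, : j + 1], alpha[: j + 1], beta[:j]
                break
            Q[:, j + 1] = w / beta[j]
    m = len(alpha)
    T = np.diag(alpha) + np.diag(beta[: m - 1], 1) + np.diag(beta[: m - 1], -1)
    # f(T) e_1 via T's eigendecomposition; V[0] is the first row of V, i.e. V^T e_1.
    evals, V = np.linalg.eigh(T)
    fT_e1 = V @ (f(evals) * V[0])
    return np.linalg.norm(x) * (Q @ fT_e1)
```

With k = n and full reorthogonalization this recovers f(A)x to roughly machine precision; the paper's subject is precisely how well the cheap, non-reorthogonalized variant tracks this ideal in finite precision.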

Original language | English (US)
---|---
Title of host publication | 29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018
Editors | Artur Czumaj
Publisher | Association for Computing Machinery
Pages | 1605-1624
Number of pages | 20
ISBN (Electronic) | 9781611975031
DOIs | https://doi.org/10.1137/1.9781611975031.105
State | Published - Jan 1 2018
Event | 29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018 - New Orleans, United States. Duration: Jan 7 2018 → Jan 10 2018

### Publication series

Name | Proceedings of the Annual ACM-SIAM Symposium on Discrete Algorithms
---|---

### Other

Other | 29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018
---|---
Country | United States
City | New Orleans
Period | 1/7/18 → 1/10/18

### ASJC Scopus subject areas

- Software
- Mathematics (all)

## Fingerprint

Dive into the research topics of 'Stability of the Lanczos method for matrix function approximation'.

## Cite this

*Stability of the Lanczos method for matrix function approximation*. In A. Czumaj (Ed.), *29th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2018* (pp. 1605-1624). (Proceedings of the Annual ACM-SIAM Symposium on Discrete Algorithms). Association for Computing Machinery. https://doi.org/10.1137/1.9781611975031.105