TY - GEN
T1 - Near-optimal hierarchical matrix approximation from matrix-vector products
AU - Chen, Tyler
AU - Keles, Feyza Duman
AU - Halikias, Diana
AU - Musco, Cameron
AU - Musco, Christopher
AU - Persson, David
N1 - Publisher Copyright:
Copyright © 2025 by SIAM.
PY - 2025
Y1 - 2025
N2 - We describe a randomized algorithm for producing a near-optimal hierarchical off-diagonal low-rank (HODLR) approximation to an n×n matrix A, accessible only through matrix-vector products with A and A^T. We prove that, for the rank-k HODLR approximation problem, our method achieves a (1 + β)^(log n)-optimal approximation in expected Frobenius norm using O(k log(n)/β^3) matrix-vector products. In particular, the algorithm obtains a (1 + ε)-optimal approximation with O(k log^4(n)/ε^3) matrix-vector products, and, for any constant c, an n^c-optimal approximation with O(k log(n)) matrix-vector products. Apart from matrix-vector products, the additional computational cost of our method is just O(n poly(log(n), k, β)). We complement the upper bound with a lower bound showing that any matrix-vector query algorithm requires at least Ω(k log(n) + k/ε) queries to obtain a (1 + ε)-optimal approximation. Our algorithm can be viewed as a robust version of widely used “peeling” methods for recovering HODLR matrices and is, to the best of our knowledge, the first matrix-vector query algorithm to enjoy theoretical worst-case guarantees for approximation by any hierarchical matrix class. To control the propagation of error between levels of the hierarchical approximation, we introduce a new perturbation bound for low-rank approximation, which shows that the widely used Generalized Nyström method enjoys inherent stability when implemented with noisy matrix-vector products. We also introduce a novel randomly perforated matrix sketching method to further control the error in the peeling algorithm.
AB - We describe a randomized algorithm for producing a near-optimal hierarchical off-diagonal low-rank (HODLR) approximation to an n×n matrix A, accessible only through matrix-vector products with A and A^T. We prove that, for the rank-k HODLR approximation problem, our method achieves a (1 + β)^(log n)-optimal approximation in expected Frobenius norm using O(k log(n)/β^3) matrix-vector products. In particular, the algorithm obtains a (1 + ε)-optimal approximation with O(k log^4(n)/ε^3) matrix-vector products, and, for any constant c, an n^c-optimal approximation with O(k log(n)) matrix-vector products. Apart from matrix-vector products, the additional computational cost of our method is just O(n poly(log(n), k, β)). We complement the upper bound with a lower bound showing that any matrix-vector query algorithm requires at least Ω(k log(n) + k/ε) queries to obtain a (1 + ε)-optimal approximation. Our algorithm can be viewed as a robust version of widely used “peeling” methods for recovering HODLR matrices and is, to the best of our knowledge, the first matrix-vector query algorithm to enjoy theoretical worst-case guarantees for approximation by any hierarchical matrix class. To control the propagation of error between levels of the hierarchical approximation, we introduce a new perturbation bound for low-rank approximation, which shows that the widely used Generalized Nyström method enjoys inherent stability when implemented with noisy matrix-vector products. We also introduce a novel randomly perforated matrix sketching method to further control the error in the peeling algorithm.
UR - http://www.scopus.com/inward/record.url?scp=85216024180&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85216024180&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85216024180
T3 - Proceedings of the Annual ACM-SIAM Symposium on Discrete Algorithms
SP - 2656
EP - 2692
BT - Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2025
PB - Society for Industrial and Applied Mathematics (SIAM)
T2 - 36th Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2025
Y2 - 12 January 2025 through 15 January 2025
ER -
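
The abstract's core approximation primitive is the Generalized Nyström low-rank approximation, built entirely from sketched matrix-vector products with A and A^T. The snippet below is a minimal, illustrative numpy sketch of that primitive in isolation, not the paper's full HODLR peeling algorithm; the function name, the oversampling factor, and the use of a pseudoinverse for the core matrix are assumptions made for illustration.

```python
import numpy as np

def generalized_nystrom(matvec, rmatvec, n, k, oversample=2, rng=None):
    """Rank-<=k Generalized Nystrom approximation of an n x n matrix A,
    accessed only via matvec (X -> A @ X) and rmatvec (X -> A.T @ X).

    Returns factors (L, R) with A ~= L @ R. The left sketch is oversampled
    (roughly oversample * k columns), which is known to improve stability.
    """
    rng = np.random.default_rng() if rng is None else rng
    ell = oversample * k
    Omega = rng.standard_normal((n, k))      # right sketching matrix
    Psi = rng.standard_normal((n, ell))      # oversampled left sketching matrix

    Y = matvec(Omega)                        # A @ Omega,   n x k   (k products with A)
    Z = rmatvec(Psi)                         # A.T @ Psi,   n x ell (ell products with A^T)

    # Generalized Nystrom:  A  ~=  (A Omega) (Psi^T A Omega)^+ (Psi^T A)
    #                          =   Y (Z^T Omega)^+ Z^T
    core = Z.T @ Omega                       # Psi^T A Omega, shape ell x k
    L = Y @ np.linalg.pinv(core)             # n x ell, rank <= k
    R = Z.T                                  # ell x n
    return L, R


if __name__ == "__main__":
    # Usage sketch: a synthetic matrix with fast-decaying singular values,
    # so a rank-k approximation is meaningful.
    n, k = 256, 8
    rng = np.random.default_rng(1)
    U, _ = np.linalg.qr(rng.standard_normal((n, n)))
    V, _ = np.linalg.qr(rng.standard_normal((n, n)))
    s = 2.0 ** -np.arange(n)
    A = (U * s) @ V.T                        # A = U diag(s) V^T

    L, R = generalized_nystrom(lambda X: A @ X, lambda X: A.T @ X, n, k, rng=rng)
    err = np.linalg.norm(A - L @ R) / np.linalg.norm(A)
    print(f"relative Frobenius error of rank-{k} Generalized Nystrom: {err:.2e}")
```

In the paper's setting, this primitive is applied to the off-diagonal blocks at each level of the HODLR hierarchy, where the sketches are contaminated by contributions from other blocks; the perturbation bound described in the abstract is what controls that noise.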