Abstract
We analyze Oja’s algorithm for streaming k-PCA, and prove that it achieves performance nearly matching that of an optimal offline algorithm. Given access to a sequence of i.i.d. d × d symmetric matrices, we show that Oja’s algorithm can obtain an accurate approximation to the subspace of the top k eigenvectors of their expectation using a number of samples that scales polylogarithmically with d. Previously, such a result was only known in the case where the updates have rank one. Our analysis is based on recently developed matrix concentration tools, which allow us to prove strong bounds on the tails of the random matrices which arise in the course of the algorithm’s execution.
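The abstract states the guarantee but not the update rule itself; the following is a minimal NumPy sketch of the standard Oja iteration for streaming k-PCA, included for orientation only. The constant step size `eta`, the QR re-orthonormalization, and the synthetic rank-one data stream in the usage example are illustrative assumptions, not choices taken from the paper (which analyzes general i.i.d. symmetric updates).

```python
import numpy as np


def oja_k_pca(stream, d, k, eta=1e-3, rng=None):
    """Minimal sketch of Oja's streaming k-PCA iteration.

    `stream` yields i.i.d. symmetric d x d matrices A_t; the goal is to
    approximate the top-k eigenspace of E[A_t]. The constant step size
    `eta` is an illustrative choice, not the schedule analyzed in the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Start from a random d x k orthonormal iterate.
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for A in stream:
        # Stochastic power-iteration step followed by re-orthonormalization.
        Q, _ = np.linalg.qr(Q + eta * (A @ Q))
    return Q  # columns approximately span the top-k eigenspace of E[A_t]


if __name__ == "__main__":
    # Illustrative usage with rank-one updates A_t = x_t x_t^T; the paper's
    # setting covers general symmetric updates, not just this special case.
    d, k, n = 100, 5, 20000
    rng = np.random.default_rng(0)
    evals = np.concatenate([np.full(k, 10.0), np.ones(d - k)])  # clear eigengap
    U, _ = np.linalg.qr(rng.standard_normal((d, d)))
    L = U * np.sqrt(evals)              # Sigma = L @ L.T has eigenvalues `evals`
    X = rng.standard_normal((n, d)) @ L.T   # rows x_t ~ N(0, Sigma)
    stream = (np.outer(x, x) for x in X)
    Q = oja_k_pca(stream, d, k, eta=1e-3, rng=rng)
    err = np.linalg.norm(U[:, :k] @ U[:, :k].T - Q @ Q.T, ord=2)
    print(f"spectral-norm distance between projectors: {err:.3f}")
```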
Original language | English (US) |
---|---|
Pages (from-to) | 2463-2498 |
Number of pages | 36 |
Journal | Proceedings of Machine Learning Research |
Volume | 134 |
State | Published - 2021 |
Event | 34th Conference on Learning Theory, COLT 2021 - Boulder, United States |
Event duration | Aug 15 2021 → Aug 19 2021 |
Keywords
- Oja’s algorithm
- Streaming PCA
- non-convex optimization
- products of random matrices
ASJC Scopus subject areas
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability