TY - GEN

T1 - Principal component projection without principal component analysis

AU - Frostig, Roy

AU - Musco, Cameron

AU - Musco, Christopher

AU - Sidford, Aaron

PY - 2016

Y1 - 2016

N2 - We show how to efficiently project a vector onto the top principal components of a matrix, without explicitly computing these components. Specifically, we introduce an iterative algorithm that provably computes the projection using few calls to any black-box routine for ridge regression. By avoiding explicit principal component analysis (PCA), our algorithm is the first with no runtime dependence on the number of top principal components. We show that it can be used to give a fast iterative method for the popular principal component regression problem, giving the first major runtime improvement over the naive method of combining PCA with regression. To achieve our results, we first observe that ridge regression can be used to obtain a "smooth projection" onto the top principal components. We then sharpen this approximation to true projection using a low-degree polynomial approximation to the matrix step function. Step function approximation is a topic of long-term interest in scientific computing. We extend prior theory by constructing polynomials with simple iterative structure and rigorously analyzing their behavior under limited precision.

UR - http://www.scopus.com/inward/record.url?scp=84999015111&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84999015111&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:84999015111

T3 - 33rd International Conference on Machine Learning, ICML 2016

SP - 3495

EP - 3503

BT - 33rd International Conference on Machine Learning, ICML 2016

A2 - Weinberger, Kilian Q.

A2 - Balcan, Maria Florina

PB - International Machine Learning Society (IMLS)

T2 - 33rd International Conference on Machine Learning, ICML 2016

Y2 - 19 June 2016 through 24 June 2016

ER -