Faster least squares approximation

Petros Drineas, Michael W. Mahoney, S. Muthukrishnan, Tamás Sarlós

    Research output: Contribution to journal › Article › peer-review

    Abstract

    Least squares approximation is a technique to find an approximate solution to a system of linear equations that has no exact solution. In a typical setting, one lets n be the number of constraints and d be the number of variables, with n » d. Then, existing exact methods find a solution vector in O(nd²) time. We present two randomized algorithms that provide accurate relative-error approximations to the optimal value and the solution vector of a least squares approximation problem more rapidly than existing exact algorithms. Both of our algorithms preprocess the data with the randomized Hadamard transform. One then uniformly randomly samples constraints and solves the smaller problem on those constraints, and the other performs a sparse random projection and solves the smaller problem on those projected coordinates. In both cases, solving the smaller problem provides relative-error approximations, and, if n is sufficiently larger than d, the approximate solution can be computed in O(nd ln d) time.
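    The sampling-based algorithm described above can be sketched in a few lines of NumPy. This is an illustrative simplification, not the authors' exact algorithm: it pads to a power of two, applies a random diagonal sign matrix D and a Walsh–Hadamard transform H, uniformly samples r rows, and solves the small problem. The function names (`fwht`, `srht_lstsq`) and the choice of sample size r are assumptions for illustration; the paper's theory prescribes specific sampling complexities for the relative-error guarantee.

    ```python
    import numpy as np

    def fwht(a):
        # In-place fast Walsh-Hadamard transform along axis 0
        # (length along axis 0 must be a power of 2).
        n = a.shape[0]
        h = 1
        while h < n:
            for i in range(0, n, 2 * h):
                x = a[i:i + h].copy()
                y = a[i + h:i + 2 * h].copy()
                a[i:i + h] = x + y
                a[i + h:i + 2 * h] = x - y
            h *= 2
        return a

    def srht_lstsq(A, b, r, rng):
        # Sketch of sampling after a randomized Hadamard transform:
        # rotate with H·D to spread out row norms, then uniform sampling
        # of r rows is enough to solve a much smaller least squares problem.
        n, d = A.shape
        m = 1 << (n - 1).bit_length()            # pad to next power of two
        Ap = np.zeros((m, d)); Ap[:n] = A
        bp = np.zeros(m);      bp[:n] = b
        signs = rng.choice([-1.0, 1.0], size=m)  # diagonal of D
        Ah = fwht(signs[:, None] * Ap) / np.sqrt(m)
        bh = fwht(signs * bp) / np.sqrt(m)
        idx = rng.choice(m, size=r, replace=False)
        x, *_ = np.linalg.lstsq(Ah[idx], bh[idx], rcond=None)
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n, d = 2048, 10
        A = rng.standard_normal((n, d))
        b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)
        x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
        x_approx = srht_lstsq(A, b, r=400, rng=rng)
        print(np.linalg.norm(A @ x_approx - b) / np.linalg.norm(A @ x_exact - b))
    ```

    Because H·D flattens the leverage scores of the rows, uniform sampling after the transform behaves like importance sampling on the original matrix; with enough sampled rows the small problem's solution attains a residual close to the optimum.
    
    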

    Original language: English (US)
    Pages (from-to): 219-249
    Number of pages: 31
    Journal: Numerische Mathematik
    Volume: 117
    Issue number: 2
    State: Published - Feb 2011

    ASJC Scopus subject areas

    • Computational Mathematics
    • Applied Mathematics
