Fast Regression for Structured Inputs

Raphael A. Meyer, Cameron Musco, Christopher Musco, David P. Woodruff, Samson Zhou

    Research output: Contribution to conference › Paper › peer review


    We study the ℓp regression problem, which requires finding x ∈ ℝ^d that minimizes ‖Ax − b‖_p for a matrix A ∈ ℝ^{n×d} and response vector b ∈ ℝ^n. There has been recent interest in developing subsampling methods for this problem that can outperform standard techniques when n is very large. However, all known subsampling approaches have run time that depends exponentially on p, typically d^{O(p)}, which can be prohibitively expensive. We improve on this work by showing that for a large class of common structured matrices, such as combinations of low-rank matrices, sparse matrices, and Vandermonde matrices, there are subsampling-based methods for ℓp regression that depend polynomially on p. For example, we give an algorithm for ℓp regression on Vandermonde matrices that runs in time O(n log³ n + (dp²)^{0.5+ω} · polylog n), where ω is the exponent of matrix multiplication. The polynomial dependence on p crucially allows our algorithms to extend naturally to efficient algorithms for ℓ∞ regression, via approximation of ℓ∞ by ℓ_{O(log n)}. Of practical interest, we also develop a new subsampling algorithm for ℓp regression for arbitrary matrices, which is simpler than previous approaches for p ≥ 4.
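To make the objective concrete, here is a minimal sketch of ℓp regression on a small Vandermonde design, solved with generic damped iteratively reweighted least squares (IRLS). This is a standard textbook method chosen for illustration only, not the subsampling algorithm from the paper, and the problem sizes and noise level are made up for the example.

```python
import numpy as np

# Hypothetical small instance: a Vandermonde design matrix, one of the
# structured families the abstract mentions.
rng = np.random.default_rng(0)
n, d, p = 500, 4, 4
t = rng.uniform(-1, 1, n)
A = np.vander(t, N=d, increasing=True)    # columns [1, t, t^2, t^3], A in R^{n x d}
x_true = np.array([1.0, -2.0, 0.5, 3.0])  # illustrative ground truth
b = A @ x_true + 0.01 * rng.standard_normal(n)

def lp_regression_irls(A, b, p, iters=100, eps=1e-8):
    """Approximately solve min_x ||Ax - b||_p by damped IRLS.

    Generic method shown only to make the objective concrete; the paper's
    contribution is faster subsampling-based algorithms, not IRLS.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # l2 warm start
    for _ in range(iters):
        r = A @ x - b
        w = np.maximum(np.abs(r), eps) ** (p - 2)  # IRLS weights |r_i|^{p-2}
        Aw = A * w[:, None]                        # W A, with W = diag(w)
        # Weighted least-squares step: solve (A^T W A) x = A^T W b.
        x_ls = np.linalg.solve(A.T @ Aw, Aw.T @ b)
        x = x + (x_ls - x) / (p - 1)               # damping for p > 2
    return x

x_hat = lp_regression_irls(A, b, p)
```

With small noise, `x_hat` lands close to `x_true`; the damping factor 1/(p − 1) is the usual safeguard that keeps plain IRLS from oscillating when p > 2.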

    Original language: English (US)
    State: Published - 2022
    Event: 10th International Conference on Learning Representations, ICLR 2022 - Virtual, Online
    Duration: Apr 25, 2022 – Apr 29, 2022


    Conference: 10th International Conference on Learning Representations, ICLR 2022
    City: Virtual, Online

    ASJC Scopus subject areas

    • Language and Linguistics
    • Computer Science Applications
    • Education
    • Linguistics and Language


