### Abstract

Least squares approximation is a technique to find an approximate solution to a system of linear equations that has no exact solution. In a typical setting, one lets n be the number of constraints and d be the number of variables, with n ≫ d. Then, existing exact methods find a solution vector in O(nd^2) time. We present two randomized algorithms that provide accurate relative-error approximations to the optimal value and the solution vector of a least squares approximation problem more rapidly than existing exact algorithms. Both of our algorithms preprocess the data with the randomized Hadamard transform. One then uniformly randomly samples constraints and solves the smaller problem on those constraints, and the other performs a sparse random projection and solves the smaller problem on those projected coordinates. In both cases, solving the smaller problem provides relative-error approximations, and, if n is sufficiently larger than d, the approximate solution can be computed in O(nd ln d) time.
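The sampling-based algorithm described above can be sketched as follows. This is an illustrative NumPy implementation, not the authors' code: it randomizes the signs of the rows, applies a normalized Walsh-Hadamard transform (padding n up to a power of 2), uniformly samples `r` rows with rescaling, and solves the small least squares problem. The sketch size `r` and the seed are assumptions chosen for illustration.

```python
import numpy as np

def fwht(x):
    """Normalized fast Walsh-Hadamard transform along axis 0.

    The length along axis 0 must be a power of 2. Runs in O(m log m)
    per column, which is what makes the overall O(nd ln d) cost possible.
    """
    x = x.astype(float).copy()
    m = x.shape[0]
    h = 1
    while h < m:
        for i in range(0, m, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b
            x[i + h:i + 2 * h] = a - b
        h *= 2
    return x / np.sqrt(m)

def srht_least_squares(A, b, r, seed=0):
    """Sketch-and-solve least squares via a subsampled randomized
    Hadamard transform (illustrative sketch, not the paper's exact code).

    A : (n, d) array, b : (n,) array, r : number of sampled rows.
    Returns an approximate minimizer of ||Ax - b||_2.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    m = 1 << (n - 1).bit_length()          # pad n up to a power of 2
    Ap = np.zeros((m, d)); Ap[:n] = A
    bp = np.zeros(m);      bp[:n] = b
    signs = rng.choice([-1.0, 1.0], size=m)  # random diagonal sign flip D
    HA = fwht(signs[:, None] * Ap)           # H D A
    Hb = fwht(signs * bp)                    # H D b
    idx = rng.choice(m, size=r, replace=False)  # uniform row sampling
    scale = np.sqrt(m / r)                   # rescale so the sketch is unbiased
    x, *_ = np.linalg.lstsq(scale * HA[idx], scale * Hb[idx], rcond=None)
    return x
```

After the Hadamard preprocessing, the "leverage" of every row is roughly uniform, which is why plain uniform sampling of the transformed rows suffices for a relative-error guarantee.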

| Original language | English (US) |
|---|---|
| Pages (from-to) | 219-249 |
| Number of pages | 31 |
| Journal | Numerische Mathematik |
| Volume | 117 |
| Issue number | 2 |
| DOIs | https://doi.org/10.1007/s00211-010-0331-6 |
| State | Published - 2011 |

### ASJC Scopus subject areas

- Computational Mathematics
- Applied Mathematics

## Cite this

Drineas, P., Mahoney, M. W., Muthukrishnan, S., & Sarlós, T. (2011). Faster least squares approximation. *Numerische Mathematik*, *117*(2), 219-249. https://doi.org/10.1007/s00211-010-0331-6