## Abstract

We present and analyze a sampling algorithm for the basic linear-algebraic problem of ℓ_2 regression. The ℓ_2 regression (or least-squares fit) problem takes as input a matrix A ∈ ℝ^{n×d} (where we assume n ≫ d) and a target vector b ∈ ℝ^n, and it returns as output 𝒵 = min_{x∈ℝ^d} ‖b − Ax‖_2. Also of interest is x_opt = A^+ b, where A^+ is the Moore-Penrose generalized inverse, which is the minimum-length vector achieving the minimum. Our algorithm randomly samples r rows from the matrix A and the vector b to construct an induced ℓ_2 regression problem with many fewer rows but the same number of columns. A crucial feature of the algorithm is its nonuniform sampling probabilities. These probabilities depend in a sophisticated manner on the lengths, i.e., the Euclidean norms, of the rows of the left singular vectors of A and on the manner in which b lies in the complement of the column space of A. Under appropriate assumptions, we show relative-error approximations for both 𝒵 and x_opt. Applications of this sampling methodology are briefly discussed.
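The core idea above can be sketched in a few lines of NumPy. This is an illustrative simplification, not the paper's exact algorithm: it samples rows with probabilities proportional to the squared row norms of the left singular vectors of A (the leverage scores), omitting the b-dependent component of the probabilities and the paper's constants; the function name `sampled_regression` is invented here.

```python
import numpy as np

def sampled_regression(A, b, r, seed=None):
    """Illustrative row-sampling sketch for l2 regression.

    Samples r rows with probability proportional to the squared row
    norms of U (the left singular vectors of A), rescales them, and
    solves the small induced least-squares problem.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    # Thin SVD: the rows of U give the leverage scores of A.
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    # Sampling probabilities p_i = ||U_i||_2^2 / d (they sum to 1).
    p = np.sum(U**2, axis=1) / d
    idx = rng.choice(n, size=r, replace=True, p=p)
    # Rescale each sampled row by 1/sqrt(r * p_i) so that the sampled
    # problem is an unbiased estimator of the original one.
    scale = 1.0 / np.sqrt(r * p[idx])
    SA = A[idx] * scale[:, None]
    Sb = b[idx] * scale
    x_tilde, *_ = np.linalg.lstsq(SA, Sb, rcond=None)
    return x_tilde

# Usage: compare against the exact solution x_opt = A^+ b.
rng = np.random.default_rng(0)
A = rng.standard_normal((2000, 5))
b = A @ np.arange(1.0, 6.0) + 0.01 * rng.standard_normal(2000)
x_opt, *_ = np.linalg.lstsq(A, b, rcond=None)
x_tilde = sampled_regression(A, b, r=400, seed=1)
```

For a well-conditioned Gaussian A the leverage scores are nearly uniform, so even this simplified sketch recovers x_opt closely; the point of the nonuniform probabilities is that they remain effective when a few rows carry most of the leverage.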

| Original language | English (US) |
| --- | --- |
| Pages | 1127-1136 |
| Number of pages | 10 |
| DOIs | |
| State | Published - 2006 |
| Event | Seventeenth Annual ACM-SIAM Symposium on Discrete Algorithms - Miami, FL, United States. Duration: Jan 22 2006 → Jan 24 2006 |

### Other

| Other | Seventeenth Annual ACM-SIAM Symposium on Discrete Algorithms |
| --- | --- |
| Country/Territory | United States |
| City | Miami, FL |
| Period | 1/22/06 → 1/24/06 |

## ASJC Scopus subject areas

- Software
- General Mathematics

## Fingerprint

Dive into the research topics of 'Sampling algorithms for ℓ_2 regression and applications'. Together they form a unique fingerprint.