### Abstract

Compressive sensing is a method for recording a k-sparse signal x ∈ R^{n} with (possibly noisy) linear measurements of the form y = Ax, where A ∈ R^{m × n} describes the measurement process. Seminal results in compressive sensing show that it is possible to recover the signal x from m = O(k log(n/k)) measurements, and that this bound is tight. The model-based compressive sensing framework overcomes this lower bound and reduces the number of measurements further to m = O(k). This improvement is achieved by restricting the supports of x to a structured sparsity model, which is a subset of all (n choose k) possible k-sparse supports. This approach has led to measurement-efficient recovery schemes for a variety of signal models, including tree-sparsity and block-sparsity. While model-based compressive sensing succeeds in reducing the number of measurements, the framework entails a computationally expensive recovery process. In particular, two main barriers arise: (i) Existing recovery algorithms involve several projections into the structured sparsity model. For several sparsity models (such as tree-sparsity), the best known model-projection algorithms run in time Ω(kn), which can be too slow for large k. (ii) Existing recovery algorithms involve several matrix-vector multiplications with the measurement matrix A. Unfortunately, the only known measurement matrices suitable for model-based compressive sensing require O(nk) time for a single multiplication, which can be (again) too slow for large k. In this paper, we remove both aforementioned barriers for two popular sparsity models and reduce the complexity of recovery to nearly linear time. Our main algorithmic result concerns the tree-sparsity model, for which we solve the model-projection problem in O(n log n + k log^{2} n) time. We also construct a measurement matrix for model-based compressive sensing with matrix-vector multiplication in O(n log n) time for k ≤ n^{1/2-μ}, μ > 0.
As an added bonus, the same matrix construction can also be used to give a fast recovery scheme for the block-sparsity model.
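To illustrate the recovery pipeline the abstract describes, the sketch below implements the standard model projection for the block-sparsity model (keep the blocks with the largest ℓ2 energy, zero out the rest) inside a generic model-based iterative hard thresholding loop. This is a minimal, assumption-laden sketch of the general framework, not the paper's near-linear-time matrix construction or its tree-sparsity projection; the function names and the Gaussian measurement matrix in the usage are illustrative choices.

```python
import numpy as np

def block_model_projection(x, block_size, num_blocks):
    """Project x onto the block-sparsity model: keep the num_blocks
    contiguous blocks with the largest l2 energy, zero the rest.
    Assumes x.size is divisible by block_size."""
    n = x.size
    blocks = x.reshape(n // block_size, block_size)
    energy = (blocks ** 2).sum(axis=1)
    keep = np.argsort(energy)[-num_blocks:]
    out = np.zeros_like(blocks)
    out[keep] = blocks[keep]
    return out.reshape(n)

def model_iht(y, A, block_size, num_blocks, iters=50):
    """Generic model-based iterative hard thresholding:
    a gradient step on ||y - Ax||^2 followed by a model projection."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = block_model_projection(x + A.T @ (y - A @ x),
                                   block_size, num_blocks)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, block_size, num_blocks = 64, 40, 4, 2
    # Dense Gaussian measurement matrix (illustrative only; a single
    # multiply costs O(mn), unlike the fast construction in the paper).
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x0 = np.zeros(n)
    x0[0:4], x0[20:24] = 1.0, -1.0   # two active blocks
    x_hat = model_iht(A @ x0, A, block_size, num_blocks)
```

By construction, every iterate (and the final estimate) has at most `num_blocks` nonzero blocks, so the output always lies in the structured sparsity model.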

| Original language | English (US) |
|---|---|
| Title of host publication | Automata, Languages, and Programming - 41st International Colloquium, ICALP 2014, Proceedings |
| Publisher | Springer Verlag |
| Pages | 588-599 |
| Number of pages | 12 |
| Edition | PART 1 |
| ISBN (Print) | 9783662439470 |
| DOIs | https://doi.org/10.1007/978-3-662-43948-7_49 |
| State | Published - 2014 |
| Event | 41st International Colloquium on Automata, Languages, and Programming, ICALP 2014 - Copenhagen, Denmark. Duration: Jul 8 2014 → Jul 11 2014 |

### Publication series

| Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
|---|---|
| Number | PART 1 |
| Volume | 8572 LNCS |
| ISSN (Print) | 0302-9743 |
| ISSN (Electronic) | 1611-3349 |

### Other

| Other | 41st International Colloquium on Automata, Languages, and Programming, ICALP 2014 |
|---|---|
| Country | Denmark |
| City | Copenhagen |
| Period | 7/8/14 → 7/11/14 |

### Keywords

- Model-based compressive sensing
- compressive sensing
- model-projection
- restricted isometry property
- tree-sparsity

### ASJC Scopus subject areas

- Theoretical Computer Science
- Computer Science (all)

## Cite this

*Nearly linear-time model-based compressive sensing*. In *Automata, Languages, and Programming - 41st International Colloquium, ICALP 2014, Proceedings* (PART 1 ed., pp. 588-599). (Lecture Notes in Computer Science; Vol. 8572 LNCS, No. PART 1). Springer Verlag. https://doi.org/10.1007/978-3-662-43948-7_49