Nearly linear-time model-based compressive sensing

Chinmay Hegde, Piotr Indyk, Ludwig Schmidt

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    Compressive sensing is a method for recording a k-sparse signal x ∈ R^n with (possibly noisy) linear measurements of the form y = Ax, where A ∈ R^(m×n) describes the measurement process. Seminal results in compressive sensing show that it is possible to recover the signal x from m = O(k log(n/k)) measurements and that this is tight. The model-based compressive sensing framework overcomes this lower bound and reduces the number of measurements further to m = O(k). This improvement is achieved by limiting the supports of x to a structured sparsity model, which is a subset of all (n choose k) possible k-sparse supports. This approach has led to measurement-efficient recovery schemes for a variety of signal models, including tree-sparsity and block-sparsity. While model-based compressive sensing succeeds in reducing the number of measurements, the framework entails a computationally expensive recovery process. In particular, two main barriers arise: (i) Existing recovery algorithms involve several projections onto the structured sparsity model. For several sparsity models (such as tree-sparsity), the best known model-projection algorithms run in time Ω(kn), which can be too slow for large k. (ii) Existing recovery algorithms involve several matrix-vector multiplications with the measurement matrix A. Unfortunately, the only known measurement matrices suitable for model-based compressive sensing require O(nk) time for a single multiplication, which can again be too slow for large k. In this paper, we remove both aforementioned barriers for two popular sparsity models and reduce the complexity of recovery to nearly linear time. Our main algorithmic result concerns the tree-sparsity model, for which we solve the model-projection problem in O(n log n + k log^2 n) time. We also construct a measurement matrix for model-based compressive sensing with matrix-vector multiplication in O(n log n) time for k ≤ n^(1/2−μ), μ > 0. As an added bonus, the same matrix construction can also be used to give a fast recovery scheme for the block-sparsity model.
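
    To make the two barriers concrete, here is a minimal, hedged sketch of a model-based CoSaMP-style recovery loop in the spirit of the model-based framework the abstract refers to; it is not the algorithm of this paper. Each iteration performs two projections onto the sparsity model and one multiplication each with A^T and A, which is why a slow model-projection routine or a dense measurement matrix dominates the recovery time. The function names (model_cosamp, topk_support), the Gaussian test matrix, and all parameter choices are illustrative assumptions.

    ```python
    # Hedged sketch of a model-based CoSaMP-style recovery loop (assumed
    # names and parameters). Each iteration uses the model projection twice
    # and multiplies by A^T and A once each.
    import numpy as np


    def model_cosamp(A, y, k, project, iters=20):
        """Recover an (approximately) k-model-sparse x from y = A x.

        `project(v, k)` must return the indices of the best k-term
        approximation of v within the chosen sparsity model.
        """
        m, n = A.shape
        x = np.zeros(n)
        residual = y.copy()
        for _ in range(iters):
            proxy = A.T @ residual             # multiplication with A^T
            omega = project(proxy, 2 * k)      # model projection (identification)
            support = sorted(set(omega) | set(np.flatnonzero(x)))
            b = np.zeros(n)
            b[support] = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
            keep = project(b, k)               # model projection (pruning)
            x = np.zeros(n)
            x[keep] = b[keep]
            residual = y - A @ x               # multiplication with A
        return x


    # Demo with plain k-sparsity (top-k in magnitude) as a stand-in projection.
    def topk_support(v, k):
        return np.argsort(np.abs(v))[-k:]


    rng = np.random.default_rng(0)
    n, m, k = 64, 32, 4
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
    y = A @ x_true
    x_hat = model_cosamp(A, y, k, topk_support)
    print("recovery error:", np.linalg.norm(x_hat - x_true))
    ```

    The expensive step in the loop above is `project`. For the tree-sparsity model, the sketch below illustrates what that projection computes, using the classical exact dynamic program over a complete binary tree (node i has children 2i+1 and 2i+2, and a valid support is a rooted subtree). Its roughly O(n·k^2) running time is the kind of Ω(kn)-type cost the paper reduces to O(n log n + k log^2 n); the paper's nearly linear-time algorithm is substantially more involved than this baseline. The tree layout, function name, and toy signal are assumptions made for illustration.

    ```python
    # Hedged sketch of exact projection onto the tree-sparsity model via the
    # classical dynamic program (not the paper's nearly linear-time method).
    # Node i of a complete binary tree has children 2*i + 1 and 2*i + 2; a
    # valid k-sparse support is a k-node subtree containing the root; the
    # weight of node i is x[i]**2, so the heaviest such subtree gives the
    # best l2 tree-sparse approximation. Running time is roughly O(n * k^2).
    import numpy as np


    def tree_project(x, k):
        """Return the support of the best k-term tree-sparse approximation
        of x (assumes 1 <= k <= len(x))."""
        n = len(x)
        w = [float(xi) ** 2 for xi in x]
        NEG = float("-inf")
        # best[v][j]: max weight of a j-node subtree rooted at and containing v.
        best = [[NEG] * (k + 1) for _ in range(n)]
        split = [[None] * (k + 1) for _ in range(n)]  # (left count, right count)
        for v in range(n - 1, -1, -1):
            left, right = 2 * v + 1, 2 * v + 2
            best[v][1], split[v][1] = w[v], (0, 0)
            for j in range(2, k + 1):
                for a in range(j):               # nodes given to the left child
                    b = j - 1 - a                # nodes given to the right child
                    la = 0.0 if a == 0 else (best[left][a] if left < n else NEG)
                    rb = 0.0 if b == 0 else (best[right][b] if right < n else NEG)
                    if la == NEG or rb == NEG:
                        continue
                    if w[v] + la + rb > best[v][j]:
                        best[v][j] = w[v] + la + rb
                        split[v][j] = (a, b)
        # Backtrack from the root to read off the chosen support.
        support = set()

        def collect(v, j):
            if j == 0:
                return
            support.add(v)
            a, b = split[v][j]
            collect(2 * v + 1, a)
            collect(2 * v + 2, b)

        collect(0, k)
        return support


    # Toy example: the large leaf at node 6 is only reachable through its
    # small parent (node 2), so the projection keeps {0, 2, 6} rather than
    # the three largest entries.
    x = np.array([5.0, 3.0, 0.1, 2.0, 0.2, 0.1, 4.0])
    S = tree_project(x, k=3)
    x_proj = np.where(np.isin(np.arange(len(x)), list(S)), x, 0.0)
    print(sorted(S), x_proj)
    ```

    On the toy signal, the projection keeps the small coefficient at node 2 because that is the only way to include the large coefficient at its child, which is exactly the rooted-subtree constraint that plain k-sparse thresholding ignores.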

    Original language: English (US)
    Title of host publication: Automata, Languages, and Programming - 41st International Colloquium, ICALP 2014, Proceedings
    Publisher: Springer Verlag
    Pages: 588-599
    Number of pages: 12
    Edition: PART 1
    ISBN (Print): 9783662439470
    DOIs
    State: Published - 2014
    Event: 41st International Colloquium on Automata, Languages, and Programming, ICALP 2014 - Copenhagen, Denmark
    Duration: Jul 8, 2014 – Jul 11, 2014

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Number: PART 1
    Volume: 8572 LNCS
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349

    Other

    Other: 41st International Colloquium on Automata, Languages, and Programming, ICALP 2014
    Country/Territory: Denmark
    City: Copenhagen
    Period: 7/8/14 – 7/11/14

    Keywords

    • Model-based compressive sensing
    • compressive sensing
    • model-projection
    • restricted isometry property
    • tree-sparsity

    ASJC Scopus subject areas

    • Theoretical Computer Science
    • General Computer Science
