TY - GEN
T1 - Nearly linear-time model-based compressive sensing
AU - Hegde, Chinmay
AU - Indyk, Piotr
AU - Schmidt, Ludwig
PY - 2014
Y1 - 2014
N2 - Compressive sensing is a method for recording a k-sparse signal x ∈ R^n with (possibly noisy) linear measurements of the form y = Ax, where A ∈ R^{m × n} describes the measurement process. Seminal results in compressive sensing show that it is possible to recover the signal x from m = O(k log(n/k)) measurements and that this is tight. The model-based compressive sensing framework overcomes this lower bound and reduces the number of measurements further to m = O(k). This improvement is achieved by limiting the supports of x to a structured sparsity model, which is a subset of all (n choose k) possible k-sparse supports. This approach has led to measurement-efficient recovery schemes for a variety of signal models, including tree-sparsity and block-sparsity. While model-based compressive sensing succeeds in reducing the number of measurements, the framework entails a computationally expensive recovery process. In particular, two main barriers arise: (i) Existing recovery algorithms involve several projections into the structured sparsity model. For several sparsity models (such as tree-sparsity), the best known model-projection algorithms run in time Ω(kn), which can be too slow for large k. (ii) Existing recovery algorithms involve several matrix-vector multiplications with the measurement matrix A. Unfortunately, the only known measurement matrices suitable for model-based compressive sensing require O(nk) time for a single multiplication, which can again be too slow for large k. In this paper, we remove both aforementioned barriers for two popular sparsity models and reduce the complexity of recovery to nearly linear time. Our main algorithmic result concerns the tree-sparsity model, for which we solve the model-projection problem in O(n log n + k log^2 n) time. We also construct a measurement matrix for model-based compressive sensing with matrix-vector multiplication in O(n log n) time for k ≤ n^{1/2 − μ}, μ > 0. As an added bonus, the same matrix construction can also be used to give a fast recovery scheme for the block-sparsity model.
AB - Compressive sensing is a method for recording a k-sparse signal x ∈ R^n with (possibly noisy) linear measurements of the form y = Ax, where A ∈ R^{m × n} describes the measurement process. Seminal results in compressive sensing show that it is possible to recover the signal x from m = O(k log(n/k)) measurements and that this is tight. The model-based compressive sensing framework overcomes this lower bound and reduces the number of measurements further to m = O(k). This improvement is achieved by limiting the supports of x to a structured sparsity model, which is a subset of all (n choose k) possible k-sparse supports. This approach has led to measurement-efficient recovery schemes for a variety of signal models, including tree-sparsity and block-sparsity. While model-based compressive sensing succeeds in reducing the number of measurements, the framework entails a computationally expensive recovery process. In particular, two main barriers arise: (i) Existing recovery algorithms involve several projections into the structured sparsity model. For several sparsity models (such as tree-sparsity), the best known model-projection algorithms run in time Ω(kn), which can be too slow for large k. (ii) Existing recovery algorithms involve several matrix-vector multiplications with the measurement matrix A. Unfortunately, the only known measurement matrices suitable for model-based compressive sensing require O(nk) time for a single multiplication, which can again be too slow for large k. In this paper, we remove both aforementioned barriers for two popular sparsity models and reduce the complexity of recovery to nearly linear time. Our main algorithmic result concerns the tree-sparsity model, for which we solve the model-projection problem in O(n log n + k log^2 n) time. We also construct a measurement matrix for model-based compressive sensing with matrix-vector multiplication in O(n log n) time for k ≤ n^{1/2 − μ}, μ > 0. As an added bonus, the same matrix construction can also be used to give a fast recovery scheme for the block-sparsity model.
KW - Model-based compressive sensing
KW - compressive sensing
KW - model-projection
KW - restricted isometry property
KW - tree-sparsity
UR - http://www.scopus.com/inward/record.url?scp=84904154118&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84904154118&partnerID=8YFLogxK
U2 - 10.1007/978-3-662-43948-7_49
DO - 10.1007/978-3-662-43948-7_49
M3 - Conference contribution
AN - SCOPUS:84904154118
SN - 9783662439470
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 588
EP - 599
BT - Automata, Languages, and Programming - 41st International Colloquium, ICALP 2014, Proceedings
PB - Springer Verlag
T2 - 41st International Colloquium on Automata, Languages, and Programming, ICALP 2014
Y2 - 8 July 2014 through 11 July 2014
ER -