Provable Compressed Sensing with Generative Priors via Langevin Dynamics

Thanh V. Nguyen, Gauri Jagatap, Chinmay Hegde

    Research output: Contribution to journal › Article › peer-review

    Abstract

    Deep generative models have emerged as a powerful class of priors for signals in various inverse problems such as compressed sensing, phase retrieval, and super-resolution. In this work, we consider the compressed sensing problem and assume the unknown signal to lie in the range of a pre-trained generative model. A popular approach for signal recovery is gradient descent in the low-dimensional latent space. While gradient descent has achieved good empirical performance, its theoretical behavior is not well understood. We introduce the use of stochastic gradient Langevin dynamics (SGLD) for compressed sensing with a generative prior. Under mild assumptions on the generative model, we prove the convergence of SGLD to the true signal. We also demonstrate empirical performance competitive with standard gradient descent.
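    The recovery scheme the abstract describes can be illustrated with a minimal sketch: given measurements y = A G(z*), run Langevin dynamics on the latent loss f(z) = ½‖y − A G(z)‖², i.e. a gradient step plus injected Gaussian noise. The toy setup below (a fixed linear stand-in generator, a Gaussian measurement matrix, and the step size / temperature values) is purely illustrative and not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy setup: a fixed linear "generator" G(z) = W z maps a k-dim latent
    # to an n-dim signal, and y = A x* gives m < n compressed measurements
    # of x* = G(z*). All dimensions and matrices are illustrative stand-ins.
    k, n, m = 5, 100, 40
    W = rng.normal(size=(n, k)) / np.sqrt(k)   # stand-in generative model
    A = rng.normal(size=(m, n)) / np.sqrt(m)   # Gaussian measurement matrix
    z_true = rng.normal(size=k)
    y = A @ (W @ z_true)

    def loss_grad(z):
        """Gradient of f(z) = 0.5 * ||y - A G(z)||^2 with respect to z."""
        residual = A @ (W @ z) - y
        return W.T @ (A.T @ residual)

    # Langevin dynamics in the latent space: a gradient step plus Gaussian
    # noise whose scale is set by the step size eta and an inverse
    # temperature beta (hyperparameter values here are illustrative).
    z = rng.normal(size=k)
    eta, beta = 0.02, 1e4
    for _ in range(2000):
        noise = rng.normal(size=k) * np.sqrt(2.0 * eta / beta)
        z = z - eta * loss_grad(z) + noise

    # Relative reconstruction error of the recovered signal G(z).
    rel_err = np.linalg.norm(W @ z - W @ z_true) / np.linalg.norm(W @ z_true)
    print(rel_err)
    ```

    With a linear stand-in generator the loss is convex, so the iterates settle near z*; the nonconvex case with a deep network G is exactly where the paper's convergence analysis of SGLD comes in.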

    Original language: English (US)
    Pages (from-to): 1
    Number of pages: 1
    Journal: IEEE Transactions on Information Theory
    DOIs
    State: Accepted/In press - 2022

    Keywords

    • Compressed sensing
    • Convergence
    • generative models
    • Generators
    • Heuristic algorithms
    • Inverse problems
    • Langevin dynamics
    • Standards
    • Stochastic processes

    ASJC Scopus subject areas

    • Information Systems
    • Computer Science Applications
    • Library and Information Sciences
