Sparse Regularization via Convex Analysis

Research output: Contribution to journal › Article › peer-review

Abstract

Sparse approximate solutions to linear equations are classically obtained via L1 norm regularized least squares, but this method often underestimates the true solution. As an alternative to the L1 norm, this paper proposes a class of nonconvex penalty functions that maintain the convexity of the least squares cost function to be minimized, and avoid the systematic underestimation characteristic of L1 norm regularization. The proposed penalty function is a multivariate generalization of the minimax-concave penalty. It is defined in terms of a new multivariate generalization of the Huber function, which in turn is defined via infimal convolution. The proposed sparse-regularized least squares cost function can be minimized by proximal algorithms comprising simple computations.
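The constructions named in the abstract can be illustrated numerically. Below is a minimal Python/NumPy sketch, not the authors' reference code: huber evaluates the scalar Huber function as the infimal convolution min_v { |v| + (x - v)^2 / 2 } over a grid of v, mc_penalty forms the scalar minimax-concave penalty |x| - huber(x), and gmc_least_squares runs a simple forward-backward soft-thresholding iteration on a saddle-point formulation of the penalized least-squares problem. All function names, the parameter gamma, the step size mu, and the toy problem sizes are assumptions chosen for illustration, not values taken from the paper.

# Hedged sketch of the ideas in the abstract; details are assumptions, not the paper's code.
import numpy as np

def huber(x, grid=np.linspace(-10, 10, 20001)):
    """Scalar Huber function via infimal convolution:
       s(x) = min_v { |v| + 0.5*(x - v)**2 }, approximated on a dense grid of v."""
    x = np.atleast_1d(x)
    return np.array([np.min(np.abs(grid) + 0.5 * (xi - grid) ** 2) for xi in x])

def mc_penalty(x):
    """Scalar minimax-concave (MC) penalty: |x| - Huber(x)."""
    return np.abs(np.atleast_1d(x)) - huber(x)

def soft(x, t):
    """Soft-thresholding: the proximal operator of t*||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def gmc_least_squares(y, A, lam, gamma=0.8, mu=None, n_iter=2000):
    """Forward-backward iteration (a sketch) for the saddle-point formulation
         min_x max_v 0.5*||y - A x||^2 + lam*||x||_1 - lam*||v||_1 - 0.5*gamma*||A(x - v)||^2.
       The step size below is a conservative illustrative choice."""
    m, n = A.shape
    AtA, Aty = A.T @ A, A.T @ y
    if mu is None:
        mu = 1.0 / (np.linalg.norm(AtA, 2) * max(1.0, gamma / (1.0 - gamma)))
    x = np.zeros(n)
    v = np.zeros(n)
    for _ in range(n_iter):
        # Gradient steps on the smooth terms, then soft-thresholding (prox of lam*||.||_1).
        gx = AtA @ x - Aty - gamma * AtA @ (x - v)
        gv = gamma * AtA @ (v - x)
        x = soft(x - mu * gx, mu * lam)
        v = soft(v - mu * gv, mu * lam)
    return x

if __name__ == "__main__":
    # MC penalty saturates: roughly |x| - x^2/2 near the origin, constant 0.5 for |x| >= 1.
    print("MC penalty at 0.5 and 3.0:", mc_penalty(np.array([0.5, 3.0])))
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100)) / np.sqrt(50)   # assumed toy problem sizes
    x_true = np.zeros(100)
    x_true[[5, 30, 70]] = [4.0, -3.0, 5.0]
    y = A @ x_true + 0.05 * rng.standard_normal(50)
    x_hat = gmc_least_squares(y, A, lam=0.1)
    print("indices with |x_hat| > 0.1:", np.flatnonzero(np.abs(x_hat) > 0.1))

Setting gamma = 0 in this sketch makes the v-update inert and reduces the x-update to the classical iterative soft-thresholding step for L1-regularized least squares; larger gamma (up to 1) strengthens the nonconvex correction that counteracts the underestimation noted in the abstract, while the overall cost is intended to remain convex.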

Original language: English (US)
Article number: 7938377
Pages (from-to): 4481-4494
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Volume: 65
Issue number: 17
DOIs:
State: Published - Sep 1 2017

Keywords

  • Sparse regularization
  • convex function
  • denoising
  • optimization
  • sparse approximation

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
