Abstract
Convex optimization with sparsity-promoting convex regularization is a standard approach for estimating sparse signals in noise. In order to promote sparsity more strongly than convex regularization, it is also standard practice to employ non-convex optimization. In this paper, we take a third approach. We utilize a non-convex regularization term chosen such that the total cost function (consisting of data consistency and regularization terms) is convex. Therefore, sparsity is more strongly promoted than in the standard convex formulation, but without sacrificing the attractive aspects of convex optimization (unique minimum, robust algorithms, etc.). We use this idea to improve the recently developed 'overlapping group shrinkage' (OGS) algorithm for the denoising of group-sparse signals. The algorithm is applied to the problem of speech enhancement with favorable results in terms of both SNR and perceptual quality.
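As a scalar illustration of why a non-convex penalty can still yield a convex total cost (the logarithmic penalty below is chosen for concreteness and is not necessarily the exact penalty used in the paper), consider the one-dimensional cost

$$
F(x) = \tfrac{1}{2}(y - x)^2 + \lambda\,\phi(x; a), \qquad \phi(x; a) = \tfrac{1}{a}\log(1 + a|x|), \quad a > 0 .
$$

Away from the origin, $\phi''(x; a) \ge -a$, so $F''(x) \ge 1 - \lambda a$ and the total cost remains convex whenever $0 < a \le 1/\lambda$, even though the regularizer itself is non-convex.

The sketch below shows the baseline convex OGS-style update that the paper improves on, assuming the standard $\ell_2$-norm-of-overlapping-groups penalty and a majorization-minimization iteration. The function name, default parameters, and the small `eps` guard are illustrative; the paper's convexity-preserving non-convex penalty is not reproduced here.

```python
import numpy as np

def ogs_denoise(y, lam, K, n_iter=25, eps=1e-10):
    """Minimal sketch of overlapping group shrinkage (OGS) for 1-D denoising.

    Majorization-minimization iteration for the convex baseline cost
        0.5 * ||y - x||^2 + lam * sum_i sqrt( sum_{j=0}^{K-1} x[i+j]**2 ),
    where groups of length K overlap (translation-invariant).
    """
    y = np.asarray(y, dtype=float)
    x = y.copy()                      # standard initialization: start at the noisy signal
    h = np.ones(K)                    # indicator window for one group
    for _ in range(n_iter):
        # r[m]: l2 energy of the (zero-padded) group ending at sample m
        r = np.sqrt(np.convolve(x**2, h, mode='full') + eps)
        # each sample is attenuated according to every overlapping group it belongs to
        x = y / (1.0 + lam * np.convolve(1.0 / r, h, mode='valid'))
    return x
```

For speech enhancement, shrinkage of this kind would typically be applied to time-frequency (e.g., STFT) coefficients rather than the raw waveform; the one-dimensional sketch above only conveys the structure of the update.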
Original language | English (US)
---|---
Article number | 6826555
Pages (from-to) | 3464-3478
Number of pages | 15
Journal | IEEE Transactions on Signal Processing
Volume | 62
Issue number | 13
DOIs | 
State | Published - Jul 1 2014
Keywords
- Convex optimization
- denoising
- group sparse model
- non-convex optimization
- sparse optimization
- speech enhancement
- translation-invariant denoising
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering