TY - GEN
T1 - Sampling schemes and parameter estimation for nonlinear Bernoulli-Gaussian sparse models
AU - Boudineau, Megane
AU - Carfantan, Herve
AU - Bourguignon, Sebastien
AU - Bazot, Michael
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/8/24
Y1 - 2016/8/24
N2 - We address the sparse approximation problem in the case where the data are approximated by the linear combination of a small number of elementary signals, each of these signals depending nonlinearly on additional parameters. Sparsity is explicitly expressed through a Bernoulli-Gaussian hierarchical model in a Bayesian framework. Posterior mean estimates are computed using Markov chain Monte Carlo algorithms. We generalize the partially marginalized Gibbs sampler proposed in the linear case in [1], and build a hybrid Hastings-within-Gibbs algorithm in order to account for the nonlinear parameters. All model parameters are then estimated in an unsupervised procedure. The resulting method is evaluated on a sparse spectral analysis problem. It is shown to converge more efficiently than the classical joint estimation procedure, with only a slight increase in the computational cost per iteration, consequently reducing the global cost of the estimation procedure.
AB - We address the sparse approximation problem in the case where the data are approximated by the linear combination of a small number of elementary signals, each of these signals depending nonlinearly on additional parameters. Sparsity is explicitly expressed through a Bernoulli-Gaussian hierarchical model in a Bayesian framework. Posterior mean estimates are computed using Markov chain Monte Carlo algorithms. We generalize the partially marginalized Gibbs sampler proposed in the linear case in [1], and build a hybrid Hastings-within-Gibbs algorithm in order to account for the nonlinear parameters. All model parameters are then estimated in an unsupervised procedure. The resulting method is evaluated on a sparse spectral analysis problem. It is shown to converge more efficiently than the classical joint estimation procedure, with only a slight increase in the computational cost per iteration, consequently reducing the global cost of the estimation procedure.
UR - http://www.scopus.com/inward/record.url?scp=84987864228&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84987864228&partnerID=8YFLogxK
U2 - 10.1109/SSP.2016.7551706
DO - 10.1109/SSP.2016.7551706
M3 - Conference contribution
AN - SCOPUS:84987864228
T3 - IEEE Workshop on Statistical Signal Processing Proceedings
BT - 2016 19th IEEE Statistical Signal Processing Workshop, SSP 2016
PB - IEEE Computer Society
T2 - 19th IEEE Statistical Signal Processing Workshop, SSP 2016
Y2 - 25 June 2016 through 29 June 2016
ER -