Jerome Bobin sent me the following opportunity the other day:

POST-DOC: LEARNING REPRESENTATIONS FOR LARGE-SCALE MULTIVARIATE DATA

The concept of sparsity and sparse signal representations has led to the development of very efficient analysis methods in imaging science. Most state-of-the-art solutions to classical inverse problems in imaging are grounded in sparsity: denoising, deconvolution, inpainting, blind source separation, etc. [SMF10]. Fixed or analytic signal representations, such as the celebrated wavelet transforms, curvelet frames, and bandlets, to name only a few [SMF10], make it possible to compressively encode the inner geometrical structures of generic signals using a few basis elements or atoms. Since compressibility or sparsity is the key principle, dictionary learning techniques [AEB06, RPE13] have more recently been introduced to provide data-driven, and therefore more efficient, sparse signal representations.
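As a toy illustration of the data-driven representations described above, here is a minimal sketch of dictionary learning on synthetic signals using scikit-learn's `MiniBatchDictionaryLearning`. The algorithm choice, data, and parameters are illustrative assumptions on my part; the references above use K-SVD and its analysis variant.

```python
# Hedged sketch: learning a dictionary that sparsely codes synthetic signals.
# Illustrative only -- not the algorithm prescribed by the project.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)

# Synthetic data: each row is a sparse combination of 32 hidden atoms plus noise.
n_samples, n_features, n_atoms = 200, 64, 32
hidden_atoms = rng.standard_normal((n_atoms, n_features))
active = rng.random((n_samples, n_atoms)) < 0.1      # ~10% of atoms active
codes = rng.standard_normal((n_samples, n_atoms)) * active
X = codes @ hidden_atoms + 0.01 * rng.standard_normal((n_samples, n_features))

# Learn a dictionary from the data; transform() returns sparse codes.
dl = MiniBatchDictionaryLearning(n_components=n_atoms, alpha=1.0,
                                 batch_size=32, random_state=0)
sparse_codes = dl.fit_transform(X)

# Most coefficients are zero: the learned representation is sparse.
sparsity = np.mean(sparse_codes == 0)
print(f"fraction of zero coefficients: {sparsity:.2f}")
```

The learned dictionary is then plugged into inverse-problem solvers in place of a fixed analytic transform such as wavelets.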

The appeal of dictionary learning techniques lies in their ability to capture a very wide range of signal/image content or morphologies, which makes them well suited to analyzing complex real-world datasets. However, these methods have seldom been extended to learn sparse representations of multivariate data such as multi/hyperspectral data, which play a prominent role in scientific fields as diverse as remote sensing, biomedical imaging, and astrophysics. Studying extensions of dictionary learning techniques to derive sparse representations specifically tailored to multispectral data is therefore fundamental in imaging science. In this context, the goals of this research project are:

• Extend dictionary learning techniques to analyze multi/hyperspectral data. We will particularly focus on studying dedicated learning strategies to extract sparse multivariate representations.

• Apply and evaluate the proposed representations for solving key inverse problems in multispectral imaging such as missing data interpolation (inpainting), reconstruction from incomplete and incoherent measurements (compressed sensing), etc.

• Particular attention will be paid to the design of learning procedures that perform well in the large-scale setting. The project will therefore investigate computationally efficient learning and solving algorithms, with a specific focus on modern methods grounded in non-smooth convex optimization.

These developments will be applied to the analysis of real-world datasets in astrophysics, which may include the Planck data [1] and the Fermi/LAT data [2].
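As a toy illustration of the first inverse problem listed above (missing-data interpolation, i.e. inpainting), the sketch below recovers a signal that is sparse in a fixed DCT dictionary from a random subset of its samples via orthogonal matching pursuit. The DCT dictionary, mask ratio, and sparsity level are illustrative assumptions, not the project's actual pipeline.

```python
# Hedged sketch: sparsity-based inpainting of a 1-D signal.
import numpy as np
from scipy.fft import dct
from sklearn.linear_model import OrthogonalMatchingPursuit

n = 64
D = dct(np.eye(n), norm='ortho')     # orthogonal DCT dictionary (atoms = columns)

# Ground truth: exactly 3 active atoms, i.e. sparse in the DCT domain.
x_true = D[:, [3, 10, 25]] @ np.array([1.0, -0.5, 0.8])

# Keep roughly 60% of the samples; the rest are "missing".
rng = np.random.default_rng(1)
mask = rng.random(n) < 0.6

# Sparse recovery: fit the observed rows of D to the observed samples only.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=3, fit_intercept=False)
omp.fit(D[mask, :], x_true[mask])

# The recovered sparse code synthesizes the full signal, filling the gaps.
x_hat = D @ omp.coef_
print("max inpainting error:", np.max(np.abs(x_hat - x_true)))
```

The same recipe applies with a learned dictionary in place of the DCT, which is precisely where the project's dictionary learning work would plug in.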

Candidates must hold a PhD and have a strong background in image/signal processing, especially in sparse signal processing. Good knowledge of convex optimization is a plus.

• Contact: jbobin@cea.fr or florent.sureau@cea.fr

• Laboratory: CEA/IRFU/Cosmostat in Saclay http://www.cosmostat.org

• Financing: European project DEDALE http://dedale.cosmostat.org

• Duration: 3 years (2015–2018)

• Applications are expected prior to May 31st, 2015.

[AEB06] M. Aharon, M. Elad, and A. Bruckstein. K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation. IEEE Transactions on Signal Processing, 54(11):4311–4322, 2006.

[RPE13] Ron Rubinstein, Tomer Peleg, and Michael Elad. Analysis K-SVD: A Dictionary-Learning Algorithm for the Analysis Sparse Model. IEEE Transactions on Signal Processing, 61(3):661–677, 2013.

[SMF10] J.-L. Starck, F. Murtagh, and M.J. Fadili. Sparse Image and Signal Processing. Cambridge University Press, 2010.

[1] http://sci.esa.int/planck/

[2] http://fermi.gsfc.nasa.gov

**Join the CompressiveSensing subreddit or the Google+ Community and post there!**

Liked this entry? Subscribe to Nuit Blanche's feed; there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
