Abstract
We introduce a parametric nonlinear transformation that is well-suited for Gaussianizing data from natural images. The data are linearly transformed, and each component is then normalized by a pooled activity measure, computed by exponentiating a weighted sum of rectified and exponentiated components and a constant. We optimize the parameters of the full transformation (linear transform, exponents, weights, constant) over a database of natural images, directly minimizing the negentropy of the responses. The optimized transformation substantially Gaussianizes the data, achieving a significantly smaller mutual information between transformed components than alternative methods including ICA and radial Gaussianization. The transformation is differentiable and can be efficiently inverted, and thus induces a density model on images. We show that samples of this model are visually similar to samples of natural image patches. We demonstrate the use of the model as a prior probability density for removing additive noise. Finally, we show that the transformation can be cascaded, with each layer optimized using the same Gaussianization objective, thus offering an unsupervised method of optimizing a deep network architecture.
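One way to write the transform the abstract describes, reading "rectified" as the absolute value, is $y_i = z_i \,/\, \big(\beta_i + \sum_j \gamma_{ij}\,|z_j|^{\alpha_{ij}}\big)^{\varepsilon_i}$ with $z = Hx$ the linear transform of the data. Below is a minimal NumPy sketch of the forward pass under that reading; the function name, toy dimensions, and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gdn_forward(x, H, beta, gamma, alpha, epsilon):
    """Sketch of the generalized normalization transform:
    y_i = z_i / (beta_i + sum_j gamma_ij * |z_j|**alpha_ij) ** epsilon_i,
    where z = H x is the linearly transformed data vector."""
    z = H @ x
    # Pooled activity: a constant plus a weighted sum of rectified,
    # exponentiated components, raised to a per-component exponent.
    pool = (beta + np.sum(gamma * np.abs(z)[None, :] ** alpha, axis=1)) ** epsilon
    return z / pool

# Toy usage with arbitrary (not optimized) parameter values.
rng = np.random.default_rng(0)
n = 16
x = rng.standard_normal(n)
H = rng.standard_normal((n, n))
beta = np.full(n, 1e-2)                             # positive constants
gamma = 0.1 * np.abs(rng.standard_normal((n, n)))   # nonnegative weights
alpha = np.full((n, n), 2.0)                        # exponents on |z_j|
epsilon = np.full(n, 0.5)                           # exponent on the pool
y = gdn_forward(x, H, beta, gamma, alpha, epsilon)
```

Because the pool for each output depends on all components of $z$, inverting $y$ back to $z$ generally requires an iterative solve (e.g. a fixed-point scheme) rather than a closed-form expression; the abstract notes that this inversion is efficient.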
Original language | English (US)
---|---
State | Published - Jan 1 2016
Event | 4th International Conference on Learning Representations, ICLR 2016 - San Juan, Puerto Rico
Duration | May 2 2016 → May 4 2016
Conference
Conference | 4th International Conference on Learning Representations, ICLR 2016
---|---
Country/Territory | Puerto Rico
City | San Juan
Period | 5/2/16 → 5/4/16
ASJC Scopus subject areas
- Education
- Computer Science Applications
- Linguistics and Language
- Language and Linguistics