Learning about an exponential amount of conditional distributions

Mohamed Ishmael Belghazi, Maxime Oquab, Yann Lecun, David Lopez-Paz

Research output: Contribution to journal › Conference article › peer-review

Abstract

We introduce the Neural Conditioner (NC), a self-supervised machine able to learn about all the conditional distributions of a random vector X. The NC is a function NC(x · a, a, r) that leverages adversarial training to match each conditional distribution P(X_r | X_a = x_a). After training, the NC generalizes to sample from conditional distributions never seen during training, including the joint distribution. The NC is also able to auto-encode examples, providing data representations useful for downstream classification tasks. In sum, the NC seamlessly integrates different self-supervised tasks (each being the estimation of a conditional distribution) and levels of supervision (partially observed data) into a single learning experience.
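As a minimal sketch of the idea in the abstract, the snippet below assembles the NC's input from an example x, an availability mask a, and a request mask r: the elementwise product x · a hides the unobserved coordinates, and the two binary masks are passed alongside it as conditioning. The function name and the concatenation layout are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def nc_input(x, a, r):
    # The NC sees only the available coordinates x * a (elementwise product),
    # plus the binary masks a (available) and r (requested) as conditioning.
    # Layout (concatenation order) is an illustrative assumption.
    return np.concatenate([x * a, a, r])

# Toy example: a 4-dimensional vector with two observed coordinates.
x = np.array([0.5, -1.2, 3.0, 0.7])
a = np.array([1.0, 0.0, 1.0, 0.0])   # coordinates 0 and 2 are observed
r = 1.0 - a                          # request the complementary coordinates

inp = nc_input(x, a, r)
# Unobserved entries of x are zeroed out before the NC sees them.
```

Because a and r are resampled per example, a single network is trained across the exponentially many (a, r) mask pairs, which is what lets it match every conditional P(X_r | X_a = x_a) without enumerating them.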

Original language: English (US)
Journal: Advances in Neural Information Processing Systems
Volume: 32
State: Published - 2019
Event: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019 - Vancouver, Canada
Duration: Dec 8, 2019 to Dec 14, 2019

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

