Fast convergence of Langevin dynamics on manifold: Geodesics meet log-Sobolev

Xiao Wang, Qi Lei, Ioannis Panageas

Research output: Contribution to journal › Conference article › peer-review

Abstract

Sampling is a fundamental task with numerous applications in Machine Learning. One approach to sampling from a high-dimensional distribution e^{-f} for some function f is the Langevin Algorithm (LA). Recently, there has been substantial progress in showing fast convergence of LA even when f is non-convex, notably in Vempala and Wibisono [2019] and Moitra and Risteski [2020]: the former focuses on functions f defined on R^n, while the latter focuses on functions with symmetries (such as matrix completion type objectives) that induce a manifold structure. Our work generalizes the results of Vempala and Wibisono [2019] to the setting where f is defined on a manifold M rather than R^n. From a technical point of view, we show that the KL divergence decreases at a geometric rate whenever the distribution e^{-f} satisfies a log-Sobolev inequality on M.
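To make the sampling setup concrete, here is a minimal sketch of the (unadjusted) Langevin Algorithm in the Euclidean setting of Vempala and Wibisono [2019]. This is an illustration only, not the paper's manifold algorithm (which would replace the additive update with a geodesic/exponential-map step on M); the function names and parameters below are illustrative choices, not from the paper.

```python
import numpy as np

def langevin_sample(grad_f, x0, step=0.1, n_steps=20000, burn_in=2000, seed=0):
    """Unadjusted Langevin Algorithm (Euclidean sketch):
        x_{k+1} = x_k - step * grad_f(x_k) + sqrt(2 * step) * N(0, I).
    Returns the post-burn-in iterates as approximate samples from e^{-f}."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = []
    for k in range(n_steps):
        # Gradient step on f plus isotropic Gaussian noise.
        x = x - step * grad_f(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
        if k >= burn_in:
            samples.append(x.copy())
    return np.array(samples)

# Toy target: f(x) = x^2 / 2, so e^{-f} is the standard Gaussian N(0, 1).
samples = langevin_sample(lambda x: x, x0=np.zeros(1))
print(samples.mean(), samples.var())
```

For a log-concave target like this Gaussian, the empirical mean and variance of the iterates approach 0 and 1 (up to a discretization bias of order `step`); the paper's contribution is that a geometric KL decay of this kind holds on a manifold M under a log-Sobolev inequality, without requiring convexity of f.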

Original language: English (US)
Journal: Advances in Neural Information Processing Systems
Volume: 2020-December
State: Published - 2020
Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online
Duration: Dec 6, 2020 - Dec 12, 2020

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
