TY - JOUR
T1 - Learning Elastic Costs to Shape Monge Displacements
AU - Klein, Michal
AU - Pooladian, Aram Alexandre
AU - Ablin, Pierre
AU - Ndiaye, Eugène
AU - Niles-Weed, Jonathan
AU - Cuturi, Marco
N1 - Publisher Copyright:
© 2024 Neural Information Processing Systems Foundation. All rights reserved.
PY - 2024
Y1 - 2024
AB - Given a source and a target probability measure, the Monge problem studies efficient ways to map the former onto the latter. This efficiency is quantified by defining a cost function between source and target data. Such a cost is often set by default in the machine learning literature to the squared-Euclidean distance, ℓ₂²(x, y) := ½∥x − y∥₂². The benefits of using elastic costs, defined using a regularizer τ as c(x, y) := ℓ₂²(x, y) + τ(x − y), were recently highlighted in [Cuturi et al., 2023]. Such costs shape the displacements of Monge maps T, namely the difference T(x) − x between a source point x and its image T(x), by giving them a structure that matches that of the proximal operator of τ. In this work, we make two important contributions to the study of elastic costs: (i) For any elastic cost, we propose a numerical method to compute Monge maps that are provably optimal. This provides a much-needed routine to create synthetic problems where the ground-truth OT map is known, by analogy to the Brenier theorem, which states that the gradient of any convex potential is always a valid Monge map for the ℓ₂² cost; (ii) We propose a loss to learn the parameter θ of a parameterized regularizer τ_θ, and apply it in the case where τ_A(z) := ∥Az∥₂². This regularizer promotes displacements that lie on a low-dimensional subspace of ℝ^d, spanned by the p rows of A ∈ ℝ^{p×d}. We illustrate the soundness of our procedure on synthetic data, generated using our first contribution, showing near-perfect recovery of the subspace of A using only samples. We demonstrate the applicability of this method by showing predictive improvements on single-cell data tasks.
UR - http://www.scopus.com/inward/record.url?scp=105000542986&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=105000542986&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:105000542986
SN - 1049-5258
VL - 37
JO - Advances in Neural Information Processing Systems
JF - Advances in Neural Information Processing Systems
T2 - 38th Conference on Neural Information Processing Systems, NeurIPS 2024
Y2 - 9 December 2024 through 15 December 2024
ER -