Abstract
Spectral methods based on the eigenvectors and eigenvalues of discrete graph Laplacians, such as Diffusion Maps and Laplacian Eigenmaps, are often used for manifold learning and nonlinear dimensionality reduction. It was previously shown by Belkin & Niyogi (2007, Convergence of Laplacian eigenmaps. In Advances in Neural Information Processing Systems 19, The MIT Press, p. 129) that the eigenvectors and eigenvalues of the graph Laplacian converge to the eigenfunctions and eigenvalues of the Laplace-Beltrami operator of the manifold in the limit of infinitely many data points sampled independently from the uniform distribution over the manifold. Recently, we introduced Vector Diffusion Maps and showed that the connection Laplacian of the tangent bundle of the manifold can be approximated from random samples. In this article, we present a unified framework for approximating other connection Laplacians over the manifold by considering its principal bundle structure. We prove that the eigenvectors and eigenvalues of these Laplacians converge in the limit of infinitely many independent random samples. We generalize the spectral convergence results to the case where the data points are sampled from a non-uniform distribution, and to manifolds with and without boundary.
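To illustrate the kind of construction the abstract refers to, here is a minimal Python sketch of a graph-Laplacian spectral embedding in the Laplacian Eigenmaps / Diffusion Maps style: build a Gaussian-kernel affinity graph on points sampled from a manifold, normalize it, and take the leading eigenvectors. This is a generic sketch under assumed choices (the bandwidth `epsilon`, the symmetric normalization, and the circle example are illustrative), not the paper's exact construction or convergence setting.

```python
import numpy as np

def graph_laplacian_embedding(X, epsilon=0.5, n_components=2):
    """Eigenvalues/eigenvectors of a normalized graph Laplacian built on samples X."""
    # Pairwise squared distances between the n sample points.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    # Gaussian-kernel affinities.
    W = np.exp(-sq_dists / epsilon)
    # Degree vector and symmetric normalization D^{-1/2} W D^{-1/2}.
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(d)
    W_sym = d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # Normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2}.
    L = np.eye(len(X)) - W_sym
    # As the number of samples grows, the small eigenpairs of L approximate
    # (up to scaling) eigenpairs of the Laplace-Beltrami operator.
    eigvals, eigvecs = np.linalg.eigh(L)
    # Skip the trivial near-constant eigenvector at eigenvalue ~0.
    return eigvals[1:n_components + 1], eigvecs[:, 1:n_components + 1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Points sampled roughly uniformly from a circle embedded in R^3, with small noise.
    t = rng.uniform(0, 2 * np.pi, size=400)
    X = np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
    X += 0.01 * rng.normal(size=X.shape)
    vals, vecs = graph_laplacian_embedding(X, epsilon=0.1, n_components=2)
    print("leading nontrivial eigenvalues:", vals)
```

For the circle example, the two leading nontrivial eigenvectors recover (approximately) the sine and cosine eigenfunctions of the circle's Laplace-Beltrami operator; the article's contribution concerns the analogous convergence statements for connection Laplacians arising from the principal bundle structure, which this sketch does not implement.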
Original language | English (US) |
---|---|
Pages (from-to) | 58-123 |
Number of pages | 66 |
Journal | Information and Inference |
Volume | 6 |
Issue number | 1 |
DOIs | |
State | Published - 2017 |
Keywords
- Connection Laplacian
- Diffusion maps
- Graph connection Laplacian
- Orientable diffusion maps
- Principal bundle
- Vector diffusion distance
- Vector diffusion maps
ASJC Scopus subject areas
- Computational Theory and Mathematics
- Analysis
- Applied Mathematics
- Statistics and Probability
- Numerical Analysis