Abstract
Motivated by establishing theoretical foundations for various manifold learning algorithms, we study the estimation of the Mahalanobis distance (MD) and the associated precision matrix from high-dimensional noisy data. Relying on recent transformative results in covariance matrix estimation, we demonstrate the sensitivity of the MD and the associated precision matrix to measurement noise, determine the exact asymptotic signal-to-noise ratio at which the MD fails, and quantify its performance otherwise. In addition, for an appropriate loss function, we propose an asymptotically optimal shrinker, which is shown, both analytically and in simulations, to outperform the classical implementation of the MD. The result is extended to the manifold setup, where the nonlinear interaction between curvature and high-dimensional noise is handled. The developed solution is applied to study a multi-scale reduction problem in dynamical system analysis.
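The contrast drawn in the abstract can be sketched numerically: compute the Mahalanobis distance once with the raw sample covariance and once with a shrunk covariance. The linear shrinkage toward the identity below is a generic stand-in for illustration only, not the paper's asymptotically optimal shrinker; the dimensions, the shrinkage weight `alpha`, and the low-rank-plus-noise data model are all assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed data model for illustration: low-rank signal plus isotropic noise,
# in the "large p, comparable n" regime the abstract refers to.
p, n = 100, 200
signal = rng.standard_normal((n, 3)) @ rng.standard_normal((3, p))
X = signal + rng.standard_normal((n, p))  # noisy high-dimensional samples

mu = X.mean(axis=0)
S = np.cov(X, rowvar=False)  # p x p sample covariance (ill-conditioned here)

# Generic linear shrinkage toward a scaled identity -- a stand-in, NOT the
# paper's optimal shrinker. alpha is an arbitrary illustrative weight.
alpha = 0.1
S_shrunk = (1 - alpha) * S + alpha * (np.trace(S) / p) * np.eye(p)

def mahalanobis(x, mean, cov):
    """Mahalanobis distance of x from `mean` under covariance `cov`."""
    d = x - mean
    return float(np.sqrt(d @ np.linalg.solve(cov, d)))

x = X[0]
d_classical = mahalanobis(x, mu, S)        # classical MD via sample precision
d_shrunk = mahalanobis(x, mu, S_shrunk)    # MD via shrunk precision
```

Regularizing the covariance before inversion stabilizes the implied precision matrix when p is comparable to n, which is the failure regime the abstract quantifies.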
Original language | English (US) |
---|---|
Pages (from-to) | 1173-1202 |
Number of pages | 30 |
Journal | Information and Inference |
Volume | 11 |
Issue number | 4 |
DOIs | |
State | Published - Dec 1 2022 |
Keywords
- large p large n
- Mahalanobis distance
- optimal shrinkage
- precision matrix
ASJC Scopus subject areas
- Analysis
- Statistics and Probability
- Numerical Analysis
- Computational Theory and Mathematics
- Applied Mathematics