Strong uniform consistency with rates for kernel density estimators with general kernels on manifolds

Hau Tieng Wu, Nan Wu

Research output: Contribution to journal › Article › peer-review

Abstract

When analyzing modern machine learning algorithms, we may need to handle kernel density estimation (KDE) with intricate kernels that are not designed by the user and might even be irregular and asymmetric. To handle this emerging challenge, we provide a strong uniform consistency result with the $L^\infty$ convergence rate for KDE on Riemannian manifolds with Riemann integrable kernels (in the ambient Euclidean space). We also provide an $L^1$ consistency result for KDE on Riemannian manifolds with Lebesgue integrable kernels. The isotropic kernels considered in this paper differ from the kernels in the Vapnik-Chervonenkis class that are frequently considered in the statistics community, and we illustrate the difference when they are applied to estimate the probability density function. Moreover, we elaborate on the delicate difference between a kernel designed on the intrinsic manifold and one designed on the ambient Euclidean space, both of which might be encountered in practice. Finally, we prove a necessary and sufficient condition for an isotropic kernel to be Riemann integrable on a submanifold of the Euclidean space.
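The ambient-space setting in the abstract can be illustrated with a minimal sketch (not from the paper): data sampled uniformly from the unit circle $S^1 \subset \mathbb{R}^2$, with density estimated by an isotropic boxcar kernel evaluated on ambient Euclidean distances. The boxcar kernel is Riemann integrable but not smooth, in the spirit of the irregular kernels considered here; the helper `kde_ambient` and the bandwidth choice are hypothetical, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample n points uniformly from the unit circle S^1, a 1-D
# submanifold of R^2 (illustrative setup, not from the paper).
n = 200_000
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
X = np.column_stack([np.cos(theta), np.sin(theta)])

def kde_ambient(x, X, h, d=1):
    """Estimate the density at x with the isotropic boxcar kernel
    K(u) = 0.5 * 1{|u| <= 1} applied to ambient Euclidean distances;
    d is the intrinsic dimension of the manifold (hypothetical helper)."""
    r = np.linalg.norm(X - x, axis=1)      # distances in the ambient space
    return 0.5 * np.mean(r <= h) / h**d    # (1/(n h^d)) * sum_i K(r_i / h)

x0 = np.array([1.0, 0.0])                  # query point on the circle
est = kde_ambient(x0, X, h=0.05)
true_density = 1.0 / (2.0 * np.pi)         # uniform density on S^1
print(est, true_density)
```

For small bandwidths the chord (ambient) distance agrees with the geodesic distance to first order, so the estimate concentrates near the true value $1/(2\pi) \approx 0.159$; the gap between the two metrics is one facet of the intrinsic-versus-ambient distinction discussed in the abstract.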

Original language: English (US)
Pages (from-to): 781-799
Number of pages: 19
Journal: Information and Inference
Volume: 11
Issue number: 2
DOIs
State: Published - Jun 1 2022

Keywords

  • 2010 Math Subject Classification: 60F15
  • 62G07
  • convergence rate
  • integrability
  • kernel density estimation
  • manifold learning

ASJC Scopus subject areas

  • Analysis
  • Statistics and Probability
  • Numerical Analysis
  • Computational Theory and Mathematics
  • Applied Mathematics
