Differentially-private learning of low dimensional manifolds

Anna Choromanska, Krzysztof Choromanski, Geetha Jagannathan, Claire Monteleoni

Research output: Contribution to journal › Article

Abstract

In this paper, we study the problem of differentially-private learning of low dimensional manifolds embedded in high dimensional spaces. The problems one faces when learning in high dimensional spaces are compounded in the differentially-private setting. We achieve the dual goals of learning the manifold while maintaining the privacy of the dataset by constructing a differentially-private data structure that adapts to the doubling dimension of the dataset. Our differentially-private manifold learning algorithm extends the random projection trees of Dasgupta and Freund. A naive construction of differentially-private random projection trees could involve queries with high global sensitivity that would affect the usefulness of the trees. Instead, we present an alternate way of constructing differentially-private random projection trees that uses low sensitivity queries that are precise enough for learning the low dimensional manifolds. We prove that the size of the tree depends only on the doubling dimension of the dataset and not its extrinsic dimension.
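To make the underlying data structure concrete, the following is a minimal sketch of a Dasgupta–Freund-style random projection tree in which the split threshold is perturbed with Laplace noise. This is an illustrative toy, not the paper's actual low-sensitivity construction: the function names, the `epsilon` parameter, and the noisy-median split rule are all assumptions made for exposition.

```python
import numpy as np

def rp_tree(points, max_leaf_size=10, epsilon=1.0, seed=None):
    """Toy random-projection tree with a noise-perturbed median split.

    The Laplace-noised median is a hypothetical stand-in for the
    low-sensitivity queries described in the paper, not the authors'
    actual mechanism.
    """
    rng = np.random.default_rng(seed)
    d = points.shape[1]

    def build(idx):
        if len(idx) <= max_leaf_size:
            return {"leaf": idx.tolist()}
        # Project the cell's points onto a random unit direction
        # (the Dasgupta-Freund RP-tree split rule).
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)
        proj = points[idx] @ u
        # Split near the median, perturbed with Laplace noise scaled
        # by 1/epsilon -- a simplified privacy mechanism.
        threshold = np.median(proj) + rng.laplace(scale=1.0 / epsilon)
        left, right = idx[proj <= threshold], idx[proj > threshold]
        if len(left) == 0 or len(right) == 0:  # degenerate split: stop
            return {"leaf": idx.tolist()}
        return {"dir": u, "threshold": threshold,
                "left": build(left), "right": build(right)}

    return build(np.arange(len(points)))
```

The point of the sketch is the structural claim in the abstract: each split depends on the data only through a single scalar query (here a noisy median), so the privacy cost per level can be controlled, while the recursion still partitions cells until they are small.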

Original language: English (US)
Pages (from-to): 91-104
Number of pages: 14
Journal: Theoretical Computer Science
Volume: 620
State: Published - Mar 21 2016

Keywords

  • Differential-privacy
  • Doubling dimension
  • Low dimensional manifolds
  • Random projection tree

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science(all)
