TY - JOUR
T1 - Differentially-private learning of low dimensional manifolds
AU - Choromanska, Anna
AU - Choromanski, Krzysztof
AU - Jagannathan, Geetha
AU - Monteleoni, Claire
N1 - Publisher Copyright:
© 2015 Elsevier B.V.
PY - 2016/3/21
Y1 - 2016/3/21
N2 - In this paper, we study the problem of differentially-private learning of low dimensional manifolds embedded in high dimensional spaces. The problems one faces in learning in high dimensional spaces are compounded in differentially-private learning. We achieve the dual goals of learning the manifold while maintaining the privacy of the dataset by constructing a differentially-private data structure that adapts to the doubling dimension of the dataset. Our differentially-private manifold learning algorithm extends the random projection trees of Dasgupta and Freund. A naive construction of differentially-private random projection trees could involve queries with high global sensitivity that would affect the usefulness of the trees. Instead, we present an alternate way of constructing differentially-private random projection trees that uses low-sensitivity queries that are precise enough for learning the low dimensional manifolds. We prove that the size of the tree depends only on the doubling dimension of the dataset and not on its extrinsic dimension.
AB - In this paper, we study the problem of differentially-private learning of low dimensional manifolds embedded in high dimensional spaces. The problems one faces in learning in high dimensional spaces are compounded in differentially-private learning. We achieve the dual goals of learning the manifold while maintaining the privacy of the dataset by constructing a differentially-private data structure that adapts to the doubling dimension of the dataset. Our differentially-private manifold learning algorithm extends the random projection trees of Dasgupta and Freund. A naive construction of differentially-private random projection trees could involve queries with high global sensitivity that would affect the usefulness of the trees. Instead, we present an alternate way of constructing differentially-private random projection trees that uses low-sensitivity queries that are precise enough for learning the low dimensional manifolds. We prove that the size of the tree depends only on the doubling dimension of the dataset and not on its extrinsic dimension.
KW - Differential-privacy
KW - Doubling dimension
KW - Low dimensional manifolds
KW - Random projection tree
UR - http://www.scopus.com/inward/record.url?scp=84958280835&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84958280835&partnerID=8YFLogxK
U2 - 10.1016/j.tcs.2015.10.039
DO - 10.1016/j.tcs.2015.10.039
M3 - Article
AN - SCOPUS:84958280835
SN - 0304-3975
VL - 620
SP - 91
EP - 104
JO - Theoretical Computer Science
JF - Theoretical Computer Science
ER -