TY - GEN
T1 - Differentially-private learning of low dimensional manifolds
AU - Choromanska, Anna
AU - Choromanski, Krzysztof
AU - Jagannathan, Geetha
AU - Monteleoni, Claire
PY - 2013
AB - In this paper, we study the problem of differentially-private learning of low dimensional manifolds embedded in high dimensional spaces. The problems one faces in learning in high dimensional spaces are compounded in differentially-private learning. We achieve the dual goals of learning the manifold while maintaining the privacy of the dataset by constructing a differentially-private data structure that adapts to the doubling dimension of the dataset. Our differentially-private manifold learning algorithm extends random projection trees of Dasgupta and Freund. A naive construction of differentially-private random projection trees could involve queries with high global sensitivity that would affect the usefulness of the trees. Instead, we present an alternate way of constructing differentially-private random projection trees that uses low sensitivity queries that are precise enough for learning the low dimensional manifolds. We prove that the size of the tree depends only on the doubling dimension of the dataset and not its extrinsic dimension.
UR - http://www.scopus.com/inward/record.url?scp=84887437594&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84887437594&partnerID=8YFLogxK
DO - 10.1007/978-3-642-40935-6_18
M3 - Conference contribution
AN - SCOPUS:84887437594
SN - 9783642409349
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 249
EP - 263
BT - Algorithmic Learning Theory - 24th International Conference, ALT 2013, Proceedings
T2 - 24th International Conference on Algorithmic Learning Theory, ALT 2013
Y2 - 6 October 2013 through 9 October 2013
ER -