Privacy of social network data is a growing concern that threatens to limit access to this valuable data source. Analysis of the graph structure of social networks can provide valuable information for revenue generation and social science research, but unfortunately, ensuring this analysis does not violate individual privacy is difficult. Simply anonymizing graphs, or even releasing only aggregate results of analysis, may not provide sufficient protection. Differential privacy is an alternative privacy model, popular in data mining over tabular data, which uses noise to obscure individuals' contributions to aggregate results and offers a very strong mathematical guarantee that individuals' presence in the dataset is hidden. Analyses that were previously vulnerable to identification of individuals and extraction of private data may be safely released under differential privacy guarantees. We review two existing standards for adapting differential privacy to network data and analyze the feasibility of several common social network analysis techniques under these standards. Additionally, we propose out-link privacy, a novel standard for differential privacy over network data, and introduce two powerful out-link private algorithms for common network analysis techniques that were infeasible to privatize under previous differential privacy standards.
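To make the abstract's description of differential privacy concrete: the standard way to "use noise to obscure individuals' contributions to aggregate results" is the Laplace mechanism. The sketch below is a minimal, generic illustration of that mechanism applied to a count query with sensitivity 1; it is not the paper's out-link private algorithm, and the function names are hypothetical.

```python
import math
import random

def laplace_sample(scale):
    """Draw one sample from Laplace(0, scale) by inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon):
    """Release a count under epsilon-differential privacy (illustrative sketch).

    A counting query has sensitivity 1 (adding or removing one individual
    changes the count by at most 1), so Laplace noise with scale 1/epsilon
    suffices to hide any single individual's presence.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_sample(1.0 / epsilon)
```

Smaller values of `epsilon` give stronger privacy but noisier answers; the paper's contribution lies in defining what "one individual's contribution" means for graph data, where a single person can affect many edges.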