Nonmonotone Spectral Projected Gradient Methods on Convex Sets. SIAM Journal on Optimization.
Revealing information while preserving privacy. Proceedings of the twenty-second ACM SIGMOD-SIGACT-SIGART symposium on Principles of database systems.
Mechanism Design via Differential Privacy. Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science (FOCS '07).
A learning theory approach to non-interactive database privacy. Proceedings of the fortieth annual ACM symposium on Theory of computing (STOC '08).
Efficient projections onto the l1-ball for learning in high dimensions. Proceedings of the 25th international conference on Machine learning.
Differentially private recommender systems: building privacy into the Netflix Prize contenders. Proceedings of the 15th ACM SIGKDD international conference on Knowledge discovery and data mining.
On the geometry of differential privacy. Proceedings of the forty-second ACM symposium on Theory of computing.
Optimizing linear counting queries under differential privacy. Proceedings of the twenty-ninth ACM SIGMOD-SIGACT-SIGART symposium on Principles of database systems.
Differentially private aggregation of distributed time-series with transformation and encryption. Proceedings of the 2010 ACM SIGMOD International Conference on Management of data.
Data mining with differential privacy. Proceedings of the 16th ACM SIGKDD international conference on Knowledge discovery and data mining.
Discovering frequent patterns in sensitive data. Proceedings of the 16th ACM SIGKDD international conference on Knowledge discovery and data mining.
Differentially-private network trace analysis. Proceedings of the ACM SIGCOMM 2010 conference.
Boosting and Differential Privacy. Proceedings of the 2010 IEEE 51st Annual Symposium on Foundations of Computer Science (FOCS '10).
Boosting the accuracy of differentially private histograms through consistency. Proceedings of the VLDB Endowment.
Differentially private data cubes: optimizing noise sources and consistency. Proceedings of the 2011 ACM SIGMOD International Conference on Management of data.
iReduct: differential privacy with reduced relative errors. Proceedings of the 2011 ACM SIGMOD International Conference on Management of data.
Differentially private data release for data mining. Proceedings of the 17th ACM SIGKDD international conference on Knowledge discovery and data mining.
Differentially Private Empirical Risk Minimization. The Journal of Machine Learning Research.
Compressive mechanism: utilizing sparse representation in differential privacy. Proceedings of the 10th annual ACM workshop on Privacy in the electronic society.
Calibrating noise to sensitivity in private data analysis. Proceedings of the Third conference on Theory of Cryptography (TCC '06).
Differentially Private Spatial Decompositions. Proceedings of the 2012 IEEE 28th International Conference on Data Engineering (ICDE '12).
Differentially Private Histogram Publication. Proceedings of the 2012 IEEE 28th International Conference on Data Engineering (ICDE '12).
Non-interactive differential privacy: a survey. Proceedings of the First International Workshop on Open Data.
Optimal error of query sets under the differentially-private matrix mechanism. Proceedings of the 16th International Conference on Database Theory.
PrivGene: differentially private model fitting using genetic algorithms. Proceedings of the 2013 ACM SIGMOD International Conference on Management of Data.
Practical differential privacy via grouping and smoothing. Proceedings of the VLDB Endowment.
The geometry of differential privacy: the sparse and approximate cases. Proceedings of the forty-fifth annual ACM symposium on Theory of computing.
Understanding hierarchical methods for differentially private histograms. Proceedings of the VLDB Endowment.
Differentially private histogram publication. The VLDB Journal.
Differential privacy is a promising privacy-preserving paradigm for statistical query processing over sensitive data. It works by injecting random noise into each query result, such that it is provably hard for the adversary to infer the presence or absence of any individual record from the published noisy results. The main objective in differentially private query processing is to maximize the accuracy of the query results, while satisfying the privacy guarantees. Previous work, notably the matrix mechanism [16], has suggested that processing a batch of correlated queries as a whole can potentially achieve considerable accuracy gains, compared to answering them individually. However, as we point out in this paper, the matrix mechanism is mainly of theoretical interest; in particular, several inherent problems in its design limit its accuracy in practice, which almost never exceeds that of naïve methods. In fact, we are not aware of any existing solution that can effectively optimize a query batch under differential privacy. Motivated by this, we propose the Low-Rank Mechanism (LRM), the first practical differentially private technique for answering batch queries with high accuracy, based on a low rank approximation of the workload matrix. We prove that the accuracy provided by LRM is close to the theoretical lower bound for any mechanism to answer a batch of queries under differential privacy. Extensive experiments using real data demonstrate that LRM consistently outperforms state-of-the-art query processing solutions under differential privacy, by large margins.
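To make the two ideas in the abstract concrete, the sketch below shows (1) the standard Laplace mechanism, which adds noise calibrated to a query set's sensitivity, and (2) the low-rank batch-answering idea: factor the workload matrix W into B and L, privately answer only the rank(W) strategy queries L @ x, and recombine with B. This is an illustration under assumptions, not the paper's method: LRM optimizes the factorization to minimize expected error, whereas here a plain truncated SVD stands in for that optimization, and all function names are invented for this example.

```python
import numpy as np

def laplace_mechanism(true_answers, sensitivity, epsilon, rng):
    """Classic Laplace mechanism: perturb each answer with Laplace noise
    of scale sensitivity / epsilon."""
    return true_answers + rng.laplace(0.0, sensitivity / epsilon,
                                      size=np.shape(true_answers))

def low_rank_batch_answer(W, x, epsilon, rng):
    """Sketch of low-rank batch answering: factor the m x n workload
    W ~= B @ L, privately answer the r = rank(W) strategy queries L @ x,
    then recombine with B.  The factorization here is a truncated SVD;
    the actual LRM chooses B and L by solving an optimization problem."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    r = int(np.sum(s > 1e-10))          # numerical rank of the workload
    B = U[:, :r] * s[:r]                # m x r recombination matrix
    L = Vt[:r, :]                       # r x n strategy matrix
    # L1 sensitivity of x -> L @ x is the largest column L1 norm of L.
    sensitivity = np.abs(L).sum(axis=0).max()
    noisy_strategy = laplace_mechanism(L @ x, sensitivity, epsilon, rng)
    return B @ noisy_strategy           # noisy answers to all m queries

# Example workload: four point queries plus their total over a histogram x.
rng = np.random.default_rng(0)
W = np.vstack([np.eye(4), np.ones((1, 4))])
x = np.array([10.0, 20.0, 30.0, 40.0])
noisy_answers = low_rank_batch_answer(W, x, epsilon=1.0, rng=rng)
```

The payoff appears when a batch of m correlated queries has rank r much smaller than m: noise is injected into only r intermediate answers instead of m, which is the accuracy gain the abstract attributes to processing the batch as a whole.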