How robust are linear sketches to adaptive inputs?
Proceedings of the forty-fifth annual ACM symposium on Theory of computing
Differential privacy for the analyst via private equilibrium computation
Proceedings of the forty-fifth annual ACM symposium on Theory of computing
Mechanism design in large games: incentives and privacy
Proceedings of the 5th conference on Innovations in theoretical computer science
We initiate the study of "privacy for the analyst" in differentially private data analysis. That is, we are concerned not only with ensuring differential privacy for the data (i.e., individuals or customers), the usual object of differential privacy, but also with (differential) privacy for the set of queries posed by each data analyst. The goal is to achieve privacy with respect to other analysts, or users of the system. This problem arises only for stateful privacy mechanisms, in which the responses to queries depend on the other queries posed; a recent wave of results in the area used cleverly coordinated noise and state in order to privately answer exponentially many queries. We argue that the problem is real by proving an exponential gap between the number of queries that can be answered (with non-trivial error) by stateless and by stateful differentially private mechanisms. We then give a stateful algorithm for differentially private data analysis that also ensures differential privacy for the analyst and can answer exponentially many queries.
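To illustrate why stateless mechanisms answer far fewer queries, here is a minimal sketch (not the paper's construction) of the standard stateless Laplace mechanism: each query is answered independently with noise calibrated to its sensitivity, so under basic composition a total budget epsilon split across k queries forces noise that grows linearly in k. The concrete numbers (database size, query count, budget) are hypothetical.

```python
import math
import random

def laplace_sample(scale):
    """Draw one sample from a centered Laplace distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def stateless_answer(true_value, sensitivity, epsilon):
    """Stateless mechanism: each query gets independent Laplace noise.

    The response depends only on the current query, never on past ones,
    so privacy loss accumulates additively across queries.
    """
    return true_value + laplace_sample(sensitivity / epsilon)

# Hypothetical setup: k = 1000 counting queries (sensitivity 1) on a
# database of n = 10_000 rows, total privacy budget epsilon = 1.
n, k, total_eps = 10_000, 1000, 1.0
per_query_eps = total_eps / k  # basic composition splits the budget
errs = [abs(laplace_sample(1.0 / per_query_eps)) for _ in range(k)]
# Expected absolute error per query is sensitivity / per_query_eps = 1000,
# i.e. 10% of n -- already approaching trivial error, with only
# polynomially many queries. Stateful mechanisms avoid this blowup.
```

A stateful mechanism (e.g., one based on coordinated noise shared across queries) sidesteps this linear degradation, which is exactly where the analyst-privacy problem studied in the abstract appears: the answer to one analyst's query can leak information about another analyst's queries.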