Mechanism Design via Differential Privacy
FOCS '07 Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science
Differential privacy: a survey of results
TAMC'08 Proceedings of the 5th international conference on Theory and applications of models of computation
Approximate privacy: foundations and quantification (extended abstract)
Proceedings of the 11th ACM conference on Electronic commerce
Differentially private combinatorial optimization
SODA '10 Proceedings of the twenty-first annual ACM-SIAM symposium on Discrete Algorithms
Proceedings of the 12th ACM conference on Electronic commerce
Approximately optimal mechanism design via differential privacy
Proceedings of the 3rd Innovations in Theoretical Computer Science Conference
Calibrating noise to sensitivity in private data analysis
TCC'06 Proceedings of the Third conference on Theory of Cryptography
Approximately optimal auctions for selling privacy when costs are correlated with data
Proceedings of the 13th ACM Conference on Electronic Commerce
Privacy-aware mechanism design
Proceedings of the 13th ACM Conference on Electronic Commerce
Conducting truthful surveys, cheaply
Proceedings of the 13th ACM Conference on Electronic Commerce
The Exponential Mechanism for Social Welfare: Private, Truthful, and Nearly Optimal
FOCS '12 Proceedings of the 2012 IEEE 53rd Annual Symposium on Foundations of Computer Science
Privacy auctions for recommender systems
WINE'12 Proceedings of the 8th international conference on Internet and Network Economics
Privacy and coordination: computing on databases with endogenous participation
Proceedings of the fourteenth ACM conference on Electronic commerce
ACM SIGecom Exchanges
Exposing and mitigating privacy loss in crowdsourced survey platforms
Proceedings of the 2013 workshop on Student workshop
Redrawing the boundaries on purchasing data from privacy-sensitive individuals
Proceedings of the 5th conference on Innovations in theoretical computer science
In this paper, we consider the problem of estimating a potentially sensitive (individually stigmatizing) statistic on a population. In our model, individuals are concerned about their privacy and experience some cost as a function of their privacy loss; nevertheless, they would be willing to participate in the survey if they were compensated for their privacy cost. These cost functions are not publicly known, nor do we make Bayesian assumptions about their form or distribution. Individuals are rational and will misreport their costs for privacy if doing so is in their best interest. Ghosh and Roth recently showed that in this setting, when costs for privacy loss may be correlated with private types, if individuals value differential privacy, then no individually rational direct revelation mechanism can compute any non-trivial estimate of the population statistic. In this paper, we circumvent this impossibility result by proposing a modified notion of how individuals experience cost as a function of their privacy loss, and by giving a mechanism that does not operate by direct revelation. Instead, our mechanism can approach individuals from the population at random and make each a take-it-or-leave-it offer. This is intended to model the abilities of a surveyor who may stand on a street corner and approach passers-by.
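The flavor of the mechanism described above can be illustrated with a small sketch. This is not the paper's actual mechanism or its payment scheme; it is a hypothetical simulation in which a surveyor approaches random individuals with a fixed take-it-or-leave-it price, and accepters report their private bit through standard randomized response calibrated to epsilon-differential privacy. All names and parameters (`survey`, `price`, `n_offers`) are illustrative assumptions, not from the paper.

```python
import math
import random

def survey(population, price, epsilon, n_offers, rng=random):
    """Hypothetical take-it-or-leave-it survey sketch (not the paper's mechanism).

    `population` is a list of (bit, cost) pairs: each individual's private
    bit and the minimum payment they would accept for their privacy loss.
    The surveyor approaches `n_offers` individuals uniformly at random and
    offers a fixed `price`; each accepter reports their bit via randomized
    response, which is epsilon-differentially private for the reported bit.
    Returns a debiased estimate of the mean bit among accepters, or None
    if nobody accepts.
    """
    # Randomized response: report truthfully with probability e^eps / (e^eps + 1).
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    reports = []
    for bit, cost in rng.sample(population, n_offers):
        if cost <= price:  # individual accepts the take-it-or-leave-it offer
            reported = bit if rng.random() < p_truth else 1 - bit
            reports.append(reported)
    if not reports:
        return None
    # Debias: E[report] = (2p - 1) * mu + (1 - p), so invert for mu.
    mean_report = sum(reports) / len(reports)
    return (mean_report - (1 - p_truth)) / (2 * p_truth - 1)
```

Note that this sketch sidesteps exactly the issue the paper studies: if acceptance (`cost <= price`) is correlated with the private bit, then who accepts already leaks information, which is why the authors modify how privacy cost is modeled rather than relying on randomized response alone.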