A graph theoretic approach to statistical data security. SIAM Journal on Computing.
Rough Sets: Theoretical Aspects of Reasoning about Data.
Machine Learning.
Protecting Respondents' Identities in Microdata Release. IEEE Transactions on Knowledge and Data Engineering.
How Much Privacy? - A System to Safe Guard Personal Privacy while Releasing Databases. TSCTC '02 Proceedings of the Third International Conference on Rough Sets and Current Trends in Computing.
Security Problems for Statistical Databases with General Cell Suppressions. SSDBM '97 Proceedings of the Ninth International Conference on Scientific and Statistical Database Management.
A Logical Model for Privacy Protection. ISC '01 Proceedings of the 4th International Conference on Information Security.
On the value of private information. TARK '01 Proceedings of the 8th conference on Theoretical aspects of rationality and knowledge.
Auditing and Inference Control in Statistical Databases. IEEE Transactions on Software Engineering.
An epistemic framework for privacy protection in database linking. Data & Knowledge Engineering.
Granulation as a privacy protection mechanism. Transactions on rough sets VII.
Unauthorized inferences in semistructured databases. Information Sciences: an International Journal.
We assume a database consisting of records of individuals with private or sensitive fields. Queries on the distribution of a sensitive field within a selected subpopulation of the database can be submitted to the data center. The answers to such queries can leak private information about individuals even though no identifying information is provided. Inspired by decision theory, in this paper we present a quantitative model of the privacy protection problem in such a database query or linkage environment. In the model, the value of information is estimated from the viewpoint of the querier. To estimate this value, we define the information state of the data user as a class of probability distributions on the set of possible confidential values. We further define the usefulness of information by how easily the data user can locate the individuals who fit the descriptions given in the queries. Both the information states and the usefulness of information can be modified and refined by the user's knowledge-acquisition actions. The value of information is then defined as the expected gain of the information receiver, and privacy is protected by imposing costs on query answers that balance this gain.
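The core idea can be illustrated with a minimal sketch. All names, the 0/1 payoff function, and the numerical distributions below are hypothetical illustrations, not the paper's actual formalism: the querier's information state is modeled as a probability distribution over possible confidential values, a query answer refines that state, and the value of the answer is the resulting increase in the querier's expected gain, which the data center can offset with a cost.

```python
def expected_gain(state, payoff):
    """Expected payoff when the data user acts on belief `state`
    by guessing the most probable confidential value."""
    best_guess = max(state, key=state.get)
    return sum(p * payoff(best_guess, v) for v, p in state.items())

def value_of_answer(prior, posterior, payoff):
    """Value of information: expected gain after the query answer
    minus expected gain before it."""
    return expected_gain(posterior, payoff) - expected_gain(prior, payoff)

# Hypothetical 0/1 payoff: the user gains 1 for a correct guess.
payoff = lambda guess, true: 1.0 if guess == true else 0.0

# Illustrative information states over a binary confidential field.
prior = {"positive": 0.5, "negative": 0.5}      # before the query
posterior = {"positive": 0.8, "negative": 0.2}  # after a revealing answer

gain = value_of_answer(prior, posterior, payoff)
# A cost of at least `gain` on the answer would balance the leak.
print(gain)
```

Here the answer raises the querier's chance of correctly guessing the confidential value from 0.5 to 0.8, so the information is worth 0.3 under this payoff; charging at least that much for the answer neutralizes the expected gain.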