The Web is the largest repository of information, and a user's personal information is often scattered across pages on many different websites. Search engines make this scattered information easy to find: an attacker can collect a user's scattered information via search engines and infer private information from it. We call this kind of privacy attack a Privacy Inference Attack via Search Engines. In this paper, we propose a user-side automatic detection service that detects such privacy leakage before personal information is published. The service constructs a User Information Correlation (UICA) graph to model the associations among the items of user information returned by search engines. We map the privacy inference attack to a decision problem: searching for the privacy-inferring path with the maximal probability in the UICA graph. We propose a Privacy Leakage Detection Probability (PLD-Probability) algorithm to find this path. Extensive experiments show that the algorithm is reasonable and effective.
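The abstract does not spell out the PLD-Probability algorithm, but the core step it describes, finding the path with the maximal probability between two nodes of the UICA graph, can be sketched as a max-product variant of Dijkstra's algorithm. The graph below, the node names, and the edge probabilities are all hypothetical illustrations, not data from the paper.

```python
import heapq

def max_probability_path(graph, source, target):
    """Return (probability, path) for the source-to-target path that
    maximizes the product of edge probabilities (max-product Dijkstra)."""
    best = {source: 1.0}          # best[node] = highest path probability so far
    prev = {}                     # predecessor map for path reconstruction
    heap = [(-1.0, source)]       # max-heap simulated with negated probabilities
    while heap:
        neg_p, node = heapq.heappop(heap)
        p = -neg_p
        if node == target:
            path = [node]
            while node in prev:   # walk predecessors back to the source
                node = prev[node]
                path.append(node)
            return p, path[::-1]
        if p < best.get(node, 0.0):
            continue              # stale heap entry
        for nbr, edge_p in graph.get(node, {}).items():
            q = p * edge_p
            if q > best.get(nbr, 0.0):
                best[nbr] = q
                prev[nbr] = node
                heapq.heappush(heap, (-q, nbr))
    return 0.0, []                # target unreachable

# Hypothetical UICA graph: nodes are items of user information found via
# search engines; edge weights are assumed association probabilities.
uica = {
    "name":     {"employer": 0.9, "nickname": 0.6},
    "employer": {"address": 0.5},
    "nickname": {"address": 0.8},
}
p, path = max_probability_path(uica, "name", "address")
# name -> nickname -> address wins: 0.6 * 0.8 = 0.48 > 0.9 * 0.5 = 0.45
```

In the decision-problem framing of the abstract, a leakage would presumably be flagged when this maximal path probability exceeds some threshold, though the paper's actual criterion is not stated here.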