Homeland security measures are increasing the amount of data collected, processed, and mined. At the same time, the owners of that data have raised legitimate concerns about their privacy and about potential abuse of the data. Privacy-preserving data mining techniques make it possible to learn models without violating privacy. This paper addresses a complementary problem: what if we want to apply a model without revealing it? We present a method for applying classification rules without revealing either the data or the rules. In addition, the rules can be verified not to use "forbidden" criteria.
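The idea of matching classification rules against a record without exposing raw attribute values can be loosely illustrated with keyed hashing. This is a toy sketch under strong assumptions, not the paper's actual protocol: both parties share an HMAC key, an evaluator sees only blinded tokens, and the scheme still leaks equality patterns (and offers no dictionary-attack resistance to anyone holding the key). All names and the example rules are illustrative.

```python
import hashlib
import hmac
import secrets

def blind(key: bytes, attr: str, value: str) -> str:
    # Keyed hash of an attribute=value pair; hides raw values from an
    # evaluator that holds only the blinded tokens, not the key.
    return hmac.new(key, f"{attr}={value}".encode(), hashlib.sha256).hexdigest()

# Shared secret key (a real protocol would derive this via key agreement
# and use far stronger blinding than a plain keyed hash).
key = secrets.token_bytes(32)

# Data owner blinds its record before handing it to the evaluator.
record = {"visa": "student", "country": "X", "age_band": "30-40"}
blinded_record = {blind(key, a, v) for a, v in record.items()}

# Rule holder blinds each rule's conditions the same way; only the
# class labels stay in the clear.
rules = [
    ({("visa", "student"), ("country", "X")}, "review"),
    ({("visa", "tourist")}, "clear"),
]
blinded_rules = [
    ({blind(key, a, v) for a, v in conds}, label) for conds, label in rules
]

def classify(blinded_record, blinded_rules, default="clear"):
    # A rule fires when every one of its blinded conditions appears in
    # the blinded record; the evaluator never sees raw attribute values.
    for conds, label in blinded_rules:
        if conds <= blinded_record:
            return label
    return default

print(classify(blinded_record, blinded_rules))  # first rule fires: "review"
```

The evaluator learns which rule fired but not the underlying attribute values; auditing for "forbidden" criteria would additionally require the rule holder to prove properties of the blinded conditions, which this sketch does not attempt.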