Privacy-Oriented Data Mining by Proof Checking
PKDD '02 Proceedings of the 6th European Conference on Principles of Data Mining and Knowledge Discovery
There is growing public concern about personal data collected by both the private and public sectors. People have very little control over what kinds of data are stored and how such data is used. Moreover, the ability to infer new knowledge from existing data is increasing rapidly with advances in database and data mining technologies. We describe a solution that allows people to take control by specifying constraints on the ways in which their data can be used. User constraints are represented in formal logic, and organizations that want to use this data provide formal proofs that the software they use to process it meets those constraints. Checking the proof with an independent verifier demonstrates whether or not user constraints are respected by this software. Our notion of “privacy correctness” differs from general software correctness in two ways. First, the properties of interest are simpler, so their proofs should be easier to automate. Second, this kind of correctness is stricter: in addition to showing that a certain relation between input and output is realized, we must also show that only operations respecting the privacy constraints are applied during execution. We therefore have an intensional notion of correctness, rather than the usual extensional one. We discuss how our mechanism can be put into practice, and we present the technical aspects via an example. The example shows how users can exercise control when their data is used as input to a decision tree learning algorithm. We have formalized the example, and the proof that the privacy constraints are preserved, in Coq.
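To convey the intensional notion of correctness informally, the following sketch models the weaker, runtime analogue of the paper's approach: instead of a Coq proof that a program applies only permitted operations, an audit wrapper rejects any operation a user's constraint does not allow and records the trace of operations actually applied. All names here (`Constraint`, `AuditedRecord`, the `"compare"`/`"read"` operation labels) are illustrative assumptions, not constructs from the paper.

```python
from dataclasses import dataclass

@dataclass
class Constraint:
    """A user's privacy constraint: which operations may touch which fields.

    Illustrative stand-in for the paper's formal-logic constraints."""
    allowed: dict  # field name -> set of permitted operation names

def check(constraint, field_name, operation):
    """True iff the constraint permits this operation on this field."""
    return operation in constraint.allowed.get(field_name, set())

class AuditedRecord:
    """Wraps a record so that every operation on it is checked and logged.

    This captures the intensional view: correctness is judged by the
    operations applied during execution, not only by the final output."""
    def __init__(self, data, constraint):
        self.data = data
        self.constraint = constraint
        self.log = []  # trace of (field, operation) pairs actually applied

    def apply(self, field_name, operation, fn):
        if not check(self.constraint, field_name, operation):
            raise PermissionError(
                f"operation {operation!r} on field {field_name!r} "
                "violates the user's privacy constraint")
        self.log.append((field_name, operation))
        return fn(self.data[field_name])

# A user who allows their age to be compared against thresholds (the kind
# of test a decision tree split performs) but forbids reading it verbatim:
c = Constraint(allowed={"age": {"compare"}})
r = AuditedRecord({"age": 34, "name": "alice"}, c)

print(r.apply("age", "compare", lambda v: v > 30))  # permitted split test
try:
    r.apply("name", "read", lambda v: v)            # forbidden disclosure
except PermissionError as e:
    print("rejected:", e)
```

The paper's mechanism is stronger than this sketch: the check is discharged once and for all by a machine-checked proof about the data-processing software, so no runtime monitor is needed; the wrapper above only illustrates what property such a proof establishes.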