Gradient descent is a widely used paradigm for solving many optimization problems. It aims to minimize a target function in order to reach a local minimum; in machine learning and data mining, this function corresponds to the decision model to be discovered. In this paper, we propose a preliminary formulation of gradient descent with data privacy preservation. We present two approaches, a stochastic approach and a least-squares approach, under different assumptions. Four protocols are proposed for the two approaches, incorporating various secure building blocks for both horizontally and vertically partitioned data. We conduct experiments to evaluate the scalability of the proposed secure building blocks, as well as the accuracy and efficiency of the protocols in four different scenarios. The experimental results show that the proposed secure building blocks are reasonably scalable, and that the proposed protocols make it possible to choose a better-suited secure protocol for each application scenario.
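To make the underlying technique concrete, the following is a minimal plain (non-private) sketch of batch gradient descent minimizing a least-squares objective J(w) = (1/2n)·Σᵢ(xᵢ·w − yᵢ)². It illustrates only the optimization loop that the paper's protocols would compute securely; the function name, learning rate, and data are illustrative assumptions, not part of the paper's protocol.

```python
def gradient_descent(X, y, lr=0.1, steps=1000):
    """Minimize the least-squares objective J(w) = (1/2n) * sum((x_i.w - y_i)^2)
    by batch gradient descent. X is a list of feature rows, y the targets."""
    n = len(y)
    d = len(X[0])
    w = [0.0] * d
    for _ in range(steps):
        # Gradient of J at w: (1/n) * sum_i (x_i.w - y_i) * x_i
        grad = [0.0] * d
        for xi, yi in zip(X, y):
            residual = sum(wj * xj for wj, xj in zip(w, xi)) - yi
            for j in range(d):
                grad[j] += residual * xi[j] / n
        # Step against the gradient toward a (local) minimum
        w = [wj - lr * gj for wj, gj in zip(w, grad)]
    return w

# Example: fit y = 1 + x (first column of X is a bias term)
weights = gradient_descent([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]],
                           [2.0, 3.0, 4.0])
```

In the privacy-preserving setting, each iteration's inner products and gradient sums would be evaluated with secure building blocks over data split horizontally (by rows) or vertically (by features) among parties, rather than computed in the clear as above.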