This work describes a fully asynchronous, privacy-preserving ensemble selection approach for distributed data mining in peer-to-peer applications. The algorithm builds a global ensemble model over large amounts of data distributed across the peers of a network, without moving the data itself and with little centralized coordination: only classifiers are transmitted between peers. The test set of one classifier serves as the training set of another, and vice versa. Regularization Networks are used as the ensemble member classifiers. The approach maps all ensemble members to a mutual affinity matrix based on the classification rates between them; once the mapping is complete, the Affinity Propagation clustering algorithm is used for the selection phase. A classical asynchronous peer-to-peer cycle is executed continually to compute the mutual affinity matrix. The cycle is composed of typical grid commands: send the local classifier to a peer k, check the queue for a received classifier m, compute the local average positive hits, send the results to peer m, and send the local classifier to a peer k+1. The communication model is therefore simple point-to-point, with send-receive commands to or from a single peer. The approach can also be applied to other types of classifiers.
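The cycle and the selection phase described above can be sketched in a few lines of Python. This is a minimal, self-contained toy, not the paper's implementation: the "classifiers" are simple 1-D threshold rules standing in for Regularization Networks, the asynchronous exchange is simulated with per-peer receive queues in a round-robin schedule, and the Affinity Propagation update rules are a plain textbook implementation. All names (`Peer`, `affinity_propagation`, `N_PEERS`, etc.) are illustrative assumptions.

```python
import random
from collections import deque

N_PEERS = 6
rng = random.Random(0)

class Peer:
    """Toy peer: holds local data, a local classifier, and a receive queue."""
    def __init__(self, pid, rng):
        self.pid = pid
        self.inbox = deque()  # queue of received classifiers (point-to-point messages)
        # local 1-D data labelled by a global rule x > 0.5 (illustrative assumption)
        self.data = [(x, int(x > 0.5)) for x in (rng.random() for _ in range(40))]
        # local "classifier": threshold at the midpoint of the two class means
        ones = [x for x, y in self.data if y == 1]
        zeros = [x for x, y in self.data if y == 0]
        self.threshold = (sum(ones) / len(ones) + sum(zeros) / len(zeros)) / 2

    def evaluate(self, threshold):
        """Local average positive hits of a received classifier on local data."""
        hits = sum(int(x > threshold) == y for x, y in self.data)
        return hits / len(self.data)

peers = [Peer(i, rng) for i in range(N_PEERS)]
n = N_PEERS
S = [[0.0] * n for _ in range(n)]  # mutual affinity matrix

# Simulated asynchronous cycle: each peer m sends its classifier to peer k,
# peer k checks its queue, computes the hit rate, and the result fills S[m][k].
for step in range(1, n):
    for m in range(n):
        k = (m + step) % n
        peers[k].inbox.append((m, peers[m].threshold))  # send classifier m to peer k
    for k in range(n):
        m, thr = peers[k].inbox.popleft()               # receive from queue
        S[m][k] = peers[k].evaluate(thr)                # "send result back to peer m"

def affinity_propagation(S, damping=0.7, iters=100):
    """Plain Affinity Propagation on similarity matrix S; returns exemplar indices."""
    n = len(S)
    R = [[0.0] * n for _ in range(n)]  # responsibilities
    A = [[0.0] * n for _ in range(n)]  # availabilities
    for _ in range(iters):
        for i in range(n):
            vals = [A[i][k] + S[i][k] for k in range(n)]
            top = max(vals); ti = vals.index(top)
            second = max(v for j, v in enumerate(vals) if j != ti)
            for k in range(n):
                best = second if k == ti else top
                R[i][k] = damping * R[i][k] + (1 - damping) * (S[i][k] - best)
        for k in range(n):
            pos = [max(0.0, R[i][k]) for i in range(n)]
            total = sum(pos)
            for i in range(n):
                if i == k:
                    new = total - pos[k]
                else:
                    new = min(0.0, R[k][k] + total - pos[i] - pos[k])
                A[i][k] = damping * A[i][k] + (1 - damping) * new
    crit = [R[k][k] + A[k][k] for k in range(n)]
    ex = [k for k in range(n) if crit[k] > 0]
    return ex or [max(range(n), key=lambda k: crit[k])]  # guarantee one exemplar

# Preference (diagonal) set to the median off-diagonal affinity, a common default.
flat = sorted(S[i][k] for i in range(n) for k in range(n) if i != k)
pref = flat[len(flat) // 2]
for i in range(n):
    S[i][i] = pref

exemplars = affinity_propagation(S)
print("selected ensemble members:", exemplars)
```

The selection phase keeps only the exemplar classifiers returned by Affinity Propagation; the remaining members are discarded, which is the pruning effect the abstract refers to as ensemble selection.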