In general, pattern classification algorithms assume that all features are available both when the classifier is constructed and during its subsequent use. In many practical situations, however, data are recorded on servers that are geographically separated, and each server observes only the features of local interest. The underlying infrastructure and other logistics (such as access control) often do not permit continual synchronization. Each server thus has a partial view of the data, in the sense that only a subset of the features (the subsets are not necessarily disjoint) is available at each server. In this article, we present a classification algorithm for such distributed, vertically partitioned data.

We assume that a local classifier can be constructed from the partial view of the data available at each server. These local classifiers can be any of the many standard classifiers (e.g., neural networks, decision trees, k-nearest neighbor). Often they are built to support decision making at each location, and our focus is not on these individual local classifiers. Rather, our focus is on constructing a classifier that uses the local classifiers to achieve an error rate as close as possible to that of a classifier with access to the entire feature set. We empirically demonstrate the efficacy of the proposed algorithm and also provide theoretical results quantifying the loss incurred relative to the situation where the entire feature set is available to a single classifier.
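The setting described above can be sketched in plain Python: each "server" trains a classifier on its own feature subset, and a combiner aggregates the local predictions. The nearest-centroid local learner, the majority-vote combination rule, and all names below are illustrative assumptions for the sketch, not the algorithm proposed in the article.

```python
from collections import Counter

def train_centroid(X, y, feats):
    # Illustrative local learner: nearest centroid over a feature subset.
    sums, counts = {}, {}
    for row, label in zip(X, y):
        v = [row[f] for f in feats]
        s = sums.setdefault(label, [0.0] * len(feats))
        for i, x in enumerate(v):
            s[i] += x
        counts[label] = counts.get(label, 0) + 1
    cents = {c: [s / counts[c] for s in sums[c]] for c in sums}
    def predict(row):
        v = [row[f] for f in feats]
        return min(cents, key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(v, cents[c])))
    return predict

def combine(local_classifiers):
    # Hypothetical combiner: majority vote over the local predictions.
    def predict(row):
        votes = Counter(clf(row) for clf in local_classifiers)
        return votes.most_common(1)[0][0]
    return predict

# Toy data: 4 features split across two "servers" with overlapping views.
X = [[0, 0, 5, 5], [1, 0, 4, 5], [5, 5, 0, 0], [4, 5, 1, 0]]
y = ["a", "a", "b", "b"]
views = [[0, 1], [1, 2, 3]]      # feature subsets need not be disjoint
local_clfs = [train_centroid(X, y, f) for f in views]
global_clf = combine(local_clfs)
print(global_clf([0, 1, 5, 4]))  # prints "a"
```

Note that the combiner only needs each server's predicted label, not its raw features, which is what makes the scheme usable when the views cannot be synchronized into one table.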