Information Sciences: an International Journal
In this work we present an active learning methodology for training the probabilistic RBF (PRBF) network, a special case of the RBF network that constitutes a generalization of the Gaussian mixture model. We first propose an incremental method for semi-supervised learning based on the Expectation-Maximization (EM) algorithm. We then present an active learning method that iteratively applies the semi-supervised method to learn from the labeled and unlabeled observations concurrently, and employs a suitable criterion to select an unlabeled observation and query its label. The proposed criterion selects points near the decision boundary, which also facilitates the incremental semi-supervised learning step, since that step likewise exploits the decision boundary. Experiments on well-known data sets demonstrate promising performance.
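The abstract's query-selection idea can be illustrated with a minimal pool-based active learning sketch. This is not the authors' PRBF implementation: a Gaussian-mixture classifier (one mixture per class, trained with EM via scikit-learn's `GaussianMixture`) stands in for the PRBF network, the semi-supervised EM step is omitted, and all data, function names, and parameters below are illustrative assumptions. The selection criterion shown is the standard "smallest posterior margin" rule, which picks the unlabeled point closest to the decision boundary.

```python
# Sketch (NOT the paper's PRBF method): pool-based active learning that
# queries the unlabeled point whose top-two class posteriors are closest,
# i.e. the point nearest the decision boundary. A per-class Gaussian
# mixture stands in for the PRBF network.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

def fit_class_models(X, y, n_components=1):
    """Fit one Gaussian mixture per class (stand-in for PRBF training)."""
    models, priors = {}, {}
    for c in np.unique(y):
        Xc = X[y == c]
        models[c] = GaussianMixture(n_components=n_components,
                                    random_state=0).fit(Xc)
        priors[c] = len(Xc) / len(X)
    return models, priors

def class_posteriors(models, priors, X):
    """P(c | x) via Bayes' rule from the per-class densities."""
    classes = sorted(models)
    logp = np.column_stack(
        [models[c].score_samples(X) + np.log(priors[c]) for c in classes])
    logp -= logp.max(axis=1, keepdims=True)  # for numerical stability
    p = np.exp(logp)
    return classes, p / p.sum(axis=1, keepdims=True)

def query_index(models, priors, X_pool):
    """Select the pool point with the smallest posterior margin."""
    _, post = class_posteriors(models, priors, X_pool)
    top2 = np.sort(post, axis=1)[:, -2:]
    margin = top2[:, 1] - top2[:, 0]  # small margin = near the boundary
    return int(np.argmin(margin))

# Toy data: two Gaussian classes, a small labeled seed set, and a pool.
X = np.vstack([rng.normal([-2, 0], 1.0, size=(60, 2)),
               rng.normal([+2, 0], 1.0, size=(60, 2))])
y = np.array([0] * 60 + [1] * 60)
labeled = np.array([0, 1, 2, 3, 4, 60, 61, 62, 63, 64])  # 5 per class
pool = np.setdiff1d(np.arange(len(X)), labeled)

for _ in range(5):  # five active-learning queries
    models, priors = fit_class_models(X[labeled], y[labeled])
    i = query_index(models, priors, X[pool])
    labeled = np.append(labeled, pool[i])  # oracle reveals the label
    pool = np.delete(pool, i)

print(len(labeled))  # 15 labeled points after five queries
```

In the full method described by the abstract, retraining at each iteration would additionally run the incremental semi-supervised EM step over both the labeled set and the remaining pool, rather than fitting on the labeled points alone.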