This paper proposes methods for actively generating input locations when gathering training data, aiming to solve problems unique to multilayer perceptrons. One problem is that optimal input locations, when calculated deterministically, sometimes cluster densely around the same point and cause local minima in backpropagation training. Two probabilistic active learning methods that exploit the statistical variance of locations are proposed to solve this problem: parametric active learning and multipoint-search active learning. Another serious problem in applying active learning to multilayer perceptrons is that the Fisher information matrix can be singular, whereas many methods, including the proposed ones, assume its regularity. A technique for pruning redundant hidden units is proposed to keep the Fisher information matrix regular. Combined with this technique, active learning can be applied stably to multilayer perceptrons. The effectiveness of the proposed methods is demonstrated through computer simulations on simple artificial problems and on a real-world color-conversion problem.
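To make the degeneracy described above concrete, the sketch below (a hypothetical illustration, not the paper's algorithm) uses NumPy to show how the Fisher information matrix loses regularity when input locations cluster around a single point. For a model that is linear in its parameters with Gaussian noise, the Fisher information matrix is proportional to X^T X, where each row of X holds the gradient of the model output with respect to the parameters at one input location; the names `fisher_information` and `is_regular` are illustrative choices.

```python
import numpy as np

def fisher_information(X):
    """Fisher information matrix (up to the noise variance) for design matrix X."""
    return X.T @ X

def is_regular(F, tol=1e-10):
    """Check that the Fisher information matrix has full rank, i.e. is nonsingular."""
    return np.linalg.matrix_rank(F, tol=tol) == F.shape[0]

rng = np.random.default_rng(0)

# Well-spread input locations: the quadratic-basis design matrix has full
# column rank, so the Fisher information matrix is regular.
x_spread = rng.uniform(-1.0, 1.0, size=20)
X_spread = np.stack([np.ones_like(x_spread), x_spread, x_spread**2], axis=1)

# All input locations piled on one point, mimicking the dense clustering the
# abstract warns about: every row of X is identical, so the matrix is singular.
x_dense = np.full(20, 0.5)
X_dense = np.stack([np.ones_like(x_dense), x_dense, x_dense**2], axis=1)

print(is_regular(fisher_information(X_spread)))  # True
print(is_regular(fisher_information(X_dense)))   # False
```

Methods that rely on inverting the Fisher information matrix (as many optimal-design criteria do) would fail on the second design, which is why the paper both randomizes location selection and prunes redundant hidden units to restore regularity.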