Optimally designing the location of training input points (active learning) and choosing the best model (model selection) are two important components of supervised learning, and each has been extensively studied. However, these two issues have typically been investigated separately, as two independent problems. If training input points and models were optimized simultaneously, generalization performance could be further improved. In this paper, we propose a new approach, called ensemble active learning, that solves the problems of active learning and model selection at the same time. We demonstrate through numerical experiments that the proposed method compares favorably with alternative approaches, such as performing active learning and model selection iteratively in a sequential manner.
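To make the idea of coupling input selection with model selection concrete, the following is a minimal illustrative sketch, not the authors' actual algorithm: a set of candidate models (polynomial degrees, an assumed example) is maintained as an ensemble, each new training input is chosen where the candidate models disagree most, and a final model is selected by leave-one-out cross-validation. The toy target function, the disagreement criterion, and all parameter choices are assumptions made for this example.

```python
# Illustrative sketch only: couples active input selection with model
# selection via an ensemble of candidate models. NOT the paper's method.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Unknown function to be learned (toy choice) plus observation noise.
    return np.sin(2 * x) + 0.1 * rng.standard_normal(x.shape)

def design(x, degree):
    # Polynomial feature matrix [1, x, x^2, ..., x^degree].
    return np.vander(x, degree + 1, increasing=True)

def fit(x, y, degree):
    # Ridge-regularized least squares (small ridge for numerical stability).
    X = design(x, degree)
    return np.linalg.solve(X.T @ X + 1e-6 * np.eye(X.shape[1]), X.T @ y)

pool = np.linspace(-3, 3, 200)       # candidate training input locations
degrees = [1, 3, 5]                  # candidate models (polynomial degrees)

x_train = np.array([-2.0, 0.0, 2.0])  # small initial design
y_train = target(x_train)

for _ in range(10):                  # active-learning iterations
    # Ensemble criterion: query the pool point where the candidate
    # models disagree most (variance of their predictions).
    preds = np.stack([design(pool, d) @ fit(x_train, y_train, d)
                      for d in degrees])
    x_new = pool[np.argmax(preds.var(axis=0))]
    x_train = np.append(x_train, x_new)
    y_train = np.append(y_train, target(np.array([x_new]))[0])

def loo_error(degree):
    # Leave-one-out squared error for the final model-selection step.
    errs = []
    for i in range(len(x_train)):
        mask = np.arange(len(x_train)) != i
        w = fit(x_train[mask], y_train[mask], degree)
        errs.append((design(x_train[i:i + 1], degree) @ w - y_train[i]) ** 2)
    return float(np.mean(errs))

best = min(degrees, key=loo_error)
print("selected degree:", best, "training points:", len(x_train))
```

The sequential alternative mentioned in the abstract would instead fix one model, run active learning for it, re-select the model, and repeat; the ensemble variant above keeps all candidate models in play while the training inputs are being chosen.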