Building reliable metaclassifiers for text learning
We introduce a nonparametric model for sensitivity estimation that relies on generating points similar to the prediction point using its k nearest neighbors. Unlike most previous work, the sampled points differ from the prediction point in multiple dimensions simultaneously, in a manner dependent on the local density. Our approach is based on an intuitive notion of locality that uses the Voronoi cell around the prediction point, i.e., all points whose nearest neighbor is the prediction point. We demonstrate how an implicit density over this neighborhood can be used to compute relative estimates of the local sensitivity. The resulting estimates improve performance when used in classifier combination and classifier recalibration, and are potentially useful in active learning and a variety of other problems.
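The idea in the abstract can be sketched roughly as follows. This is a hedged illustration, not the paper's algorithm: the function name `local_sensitivity`, the Gaussian perturbation, and the use of the per-dimension spread of the k nearest neighbors as a stand-in for the implicit Voronoi-cell density are all assumptions made for the example.

```python
import numpy as np

def local_sensitivity(predict, X_train, x, k=5, n_samples=200, rng=None):
    """Illustrative sketch: estimate the relative local sensitivity of
    `predict` at point x by sampling perturbed points whose scale is set
    by x's k nearest neighbors (an assumed proxy for the local density)."""
    rng = np.random.default_rng(rng)
    # distances from x to every training point; the k closest define locality
    d = np.linalg.norm(X_train - x, axis=1)
    neighbors = X_train[np.argsort(d)[:k]]
    # per-dimension spread of the neighbors sets the sampling scale, so the
    # sampled points differ from x in multiple dimensions simultaneously
    scale = neighbors.std(axis=0) + 1e-12
    samples = x + rng.normal(0.0, scale, size=(n_samples, x.shape[0]))
    preds = np.array([predict(s) for s in samples])
    # spread of predictions over the neighborhood as a relative sensitivity
    return preds.std()
```

A denser neighborhood shrinks the sampling scale, so the estimate is relative to the local geometry of the training data, which is the intuition the abstract emphasizes; the paper's actual estimator is built on an implicit density over the Voronoi cell rather than this Gaussian stand-in.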