An application of nonparametric predictive inference for multinomial data (NPI) to classification tasks is presented. The model is applied within an established procedure for building classification trees using imprecise probabilities and uncertainty measures, a procedure so far used only with the imprecise Dirichlet model (IDM), which is defined via a parameter expressing prior knowledge. When the IDM is used, the accuracy of this classification procedure depends significantly on the value chosen for that parameter. A detailed study on 40 data sets shows that the procedure with the NPI model (which has no such parameter) achieves a better trade-off between accuracy and tree size than the procedure with the IDM, for any choice of the parameter. A bias-variance decomposition of the errors further shows that the procedure with the NPI model has lower variance than the one with the IDM, implying a lower level of over-fitting.
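To make the IDM's parameter dependence concrete: with class counts n_i out of N observations and parameter s, the IDM assigns each class the probability interval [n_i/(N+s), (n_i+s)/(N+s)], and split criteria of the kind the abstract refers to score a node by the maximum (upper) entropy over the resulting credal set. That maximum is attained by distributing the extra mass s among the least frequent classes until they level out (a water-filling computation). The sketch below illustrates this under those assumptions; the function name is my own, and this is an illustration of the criterion, not the authors' code.

```python
import math

def max_entropy_idm(counts, s=1.0):
    """Upper entropy (in bits) of the IDM credal set for the given
    class counts and parameter s.

    The maximum-entropy distribution spreads the extra mass s over the
    smallest counts, raising them to a common "water level" L, so that
    p_i = max(n_i, L) / (N + s).
    """
    N = sum(counts)
    total = N + s
    c = sorted(counts)
    k = len(c)

    # Find the water level L: the cost of raising the (i+1) smallest
    # counts to level L is (i+1)*L - sum of those counts; we search for
    # the segment where that cost first reaches s.
    prefix = 0.0
    L = c[-1] + s  # fallback, overwritten below
    for i in range(k):
        prefix += c[i]
        hi = c[i + 1] if i + 1 < k else math.inf
        if (i + 1) * hi - prefix >= s:
            L = (s + prefix) / (i + 1)
            break

    p = [max(ci, L) / total for ci in c]
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)
```

For example, with counts [2, 4, 4] and s = 2, all of the extra mass goes to the smallest class, the leveled counts become [4, 4, 4], and the upper entropy equals log2(3): the credal set contains the uniform distribution. Larger s widens the intervals and raises the upper entropy, which is the parameter dependence the study quantifies.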