Machine Learning - Special issue on learning with probabilistic representations
Pattern Recognition and Neural Networks
Assessing a Mixture Model for Clustering with the Integrated Completed Likelihood
IEEE Transactions on Pattern Analysis and Machine Intelligence
Learning a Sparse Representation for Object Detection
ECCV '02 Proceedings of the 7th European Conference on Computer Vision - Part IV
Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data
ICML '01 Proceedings of the Eighteenth International Conference on Machine Learning
Classifier Learning with Supervised Marginal Likelihood
UAI '01 Proceedings of the 17th Conference on Uncertainty in Artificial Intelligence
Eighteenth National Conference on Artificial Intelligence
Discriminative, generative and imitative learning
Selection of Scale-Invariant Parts for Object Class Recognition
ICCV '03 Proceedings of the Ninth IEEE International Conference on Computer Vision - Volume 2
Distinctive Image Features from Scale-Invariant Keypoints
International Journal of Computer Vision
Learning Bayesian network classifiers by maximizing conditional likelihood
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Hierarchical Part-Based Visual Object Categorization
CVPR '05 Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05) - Volume 1
Sports video mining via multichannel segmental hidden Markov models
IEEE Transactions on Multimedia
Development of an adaptive neuro-fuzzy classifier using linguistic hedges: Part 1
Expert Systems with Applications: An International Journal
Derivations of normalized mutual information in binary classifications
FSKD'09 Proceedings of the 6th international conference on Fuzzy systems and knowledge discovery - Volume 1
Intrinsic dimension estimation by maximum likelihood in isotropic probabilistic PCA
Pattern Recognition Letters
Statistical mixture model for documents skew angle estimation
Pattern Recognition Letters
Infinite Liouville mixture models with application to text and texture categorization
Pattern Recognition Letters
A predictive deviance criterion for selecting a generative model in semi-supervised classification
Computational Statistics & Data Analysis
This paper is concerned with the selection of a generative model for supervised classification. Classical criteria for model selection assess the fit of a model rather than its ability to produce a low classification error rate. A new criterion, the Bayesian Entropy Criterion (BEC), is proposed. This criterion takes into account the decisional purpose of a model by minimizing the integrated classification entropy, and it provides an attractive alternative to the cross-validated error rate, which is computationally expensive. The asymptotic behavior of the BEC criterion is presented. Numerical experiments on both simulated and real data sets show that BEC performs better than BIC at selecting a model that minimizes the classification error rate, with performance comparable to that of the cross-validated error rate.
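The contrast drawn in the abstract, between fit-based criteria such as BIC and the (expensive) cross-validated error rate, can be illustrated with a minimal sketch. The example below is hypothetical and not the paper's BEC implementation: it fits two candidate class-conditional Gaussian models (shared versus separate variances) to synthetic 1-D data, then scores each candidate both by BIC and by a 5-fold cross-validated error rate of the resulting Bayes rule. All names and data choices here are illustrative assumptions.

```python
import math
import random

random.seed(0)

# Synthetic two-class 1-D data (illustrative, not the paper's data sets):
# class 0 ~ N(0, 1), class 1 ~ N(2, 4), equal priors.
n = 200
x0 = [random.gauss(0.0, 1.0) for _ in range(n)]
x1 = [random.gauss(2.0, 2.0) for _ in range(n)]

def fit_gaussians(a, b, shared_var):
    """MLE of class-conditional Gaussians; optionally a pooled variance."""
    m0 = sum(a) / len(a)
    m1 = sum(b) / len(b)
    if shared_var:
        v = (sum((x - m0) ** 2 for x in a)
             + sum((x - m1) ** 2 for x in b)) / (len(a) + len(b))
        return (m0, v), (m1, v), 3   # two means + one pooled variance
    v0 = sum((x - m0) ** 2 for x in a) / len(a)
    v1 = sum((x - m1) ** 2 for x in b) / len(b)
    return (m0, v0), (m1, v1), 4     # two means + two variances

def log_norm(x, m, v):
    return -0.5 * (math.log(2 * math.pi * v) + (x - m) ** 2 / v)

def bic(a, b, shared_var):
    """BIC as penalized log-likelihood: logL - (k/2) log n (larger is better)."""
    (m0, v0), (m1, v1), k = fit_gaussians(a, b, shared_var)
    ll = (sum(log_norm(x, m0, v0) for x in a)
          + sum(log_norm(x, m1, v1) for x in b))
    return ll - 0.5 * k * math.log(len(a) + len(b))

def cv_error(a, b, shared_var, folds=5):
    """Cross-validated error rate of the Bayes rule under the fitted model."""
    errs, total = 0, 0
    for f in range(folds):
        tr_a = [x for i, x in enumerate(a) if i % folds != f]
        tr_b = [x for i, x in enumerate(b) if i % folds != f]
        test = ([(x, 0) for i, x in enumerate(a) if i % folds == f]
                + [(x, 1) for i, x in enumerate(b) if i % folds == f])
        (m0, v0), (m1, v1), _ = fit_gaussians(tr_a, tr_b, shared_var)
        for x, y in test:
            # Equal priors, so compare class-conditional log-densities only.
            pred = 0 if log_norm(x, m0, v0) >= log_norm(x, m1, v1) else 1
            errs += (pred != y)
            total += 1
    return errs / total

for shared in (True, False):
    print(f"shared variance={shared}: BIC={bic(x0, x1, shared):.1f}, "
          f"CV error={cv_error(x0, x1, shared):.3f}")
```

Here BIC and the cross-validated error rate happen to agree (the data truly have unequal variances), but BIC rewards fit to the joint density rather than the decision boundary, which is exactly the mismatch the BEC criterion is designed to address.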