Selection of Generative Models in Classification
IEEE Transactions on Pattern Analysis and Machine Intelligence
Semi-supervised classification can improve generative classifiers by taking into account the information provided by unlabeled data points, especially when unlabeled data greatly outnumber labeled data. The aim is to select a generative classification model using both unlabeled and labeled data. A predictive deviance criterion, AIC_cond, is proposed for selecting a parsimonious and relevant generative classifier in the semi-supervised context. In contrast to standard information criteria such as AIC and BIC, AIC_cond is focused on the classification task, since it attempts to measure the predictive power of a generative model by approximating its predictive deviance. At the same time, it avoids the computational cost of cross-validation criteria, which make repeated use of the EM algorithm. AIC_cond is proved to have consistency properties that ensure its parsimony when compared with the Bayesian Entropy Criterion (BEC), whose focus is similar to that of AIC_cond. Numerical experiments on both simulated and real data sets show that the behavior of AIC_cond with regard to variable and model selection is encouraging when compared to the competing criteria.
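To make the idea concrete, the following is a minimal sketch (not the authors' exact formulation) of a conditional-deviance model selection score in the spirit of AIC_cond: fit candidate generative classifiers by maximum likelihood, score each by -2 times the conditional log-likelihood of the labels given the features on the labeled data, and add a parameter-count penalty. The two candidate models (shared spherical covariance vs. full per-class covariance), the penalty constant, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_gaussian_classifier(X, y, shared_spherical):
    """Fit class priors, means, and covariances by maximum likelihood."""
    classes = np.unique(y)
    priors, means, covs = {}, {}, {}
    for c in classes:
        Xc = X[y == c]
        priors[c] = len(Xc) / len(X)
        means[c] = Xc.mean(axis=0)
        if not shared_spherical:
            covs[c] = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
    if shared_spherical:
        # One pooled variance shared by all classes (spherical covariance).
        pooled = np.mean([(X[y == c] - means[c]).var() for c in classes])
        for c in classes:
            covs[c] = pooled * np.eye(X.shape[1])
    return priors, means, covs

def log_joint(x, priors, means, covs):
    """log p(x, c) for each class c under the fitted Gaussians."""
    out = {}
    for c in priors:
        d = len(x)
        diff = x - means[c]
        _, logdet = np.linalg.slogdet(covs[c])
        quad = diff @ np.linalg.solve(covs[c], diff)
        out[c] = np.log(priors[c]) - 0.5 * (d * np.log(2 * np.pi) + logdet + quad)
    return out

def conditional_deviance_score(X, y, priors, means, covs, n_params):
    """-2 * sum_i log p(y_i | x_i) plus a hypothetical 2*n_params penalty."""
    dev = 0.0
    for x, c in zip(X, y):
        lj = log_joint(x, priors, means, covs)
        vals = np.array(list(lj.values()))
        lse = vals.max() + np.log(np.exp(vals - vals.max()).sum())  # logsumexp
        dev += -2.0 * (lj[c] - lse)  # -2 * log p(c | x)
    return dev + 2 * n_params

# Toy demo: two overlapping anisotropic classes in 2-D.
X0 = rng.normal([0.0, 0.0], [1.0, 0.3], size=(100, 2))
X1 = rng.normal([1.5, 0.5], [0.5, 1.0], size=(100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

scores = {}
for name, spherical, n_params in [("spherical", True, 6), ("full", False, 11)]:
    model = fit_gaussian_classifier(X, y, spherical)
    scores[name] = conditional_deviance_score(X, y, *model, n_params)
```

Unlike a plain likelihood-based criterion, the score above rewards only the part of the fit that matters for classification, p(y | x); the criterion in the paper additionally exploits unlabeled data through the EM-fitted mixture, which this labeled-data-only sketch omits.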