Machine learning can be divided into two schools of thought: generative model learning and discriminative model learning. While the MCS community has focused mainly on the latter, this paper is concerned with questions that arise from ensembles of generative models. Generative models provide clean ways of thinking about two interesting learning problems: model selection and semi-supervised learning. Preliminary results show that for semi-supervised, low-variance generative models, traditional MCS techniques such as Bagging and the Random Subspace Method (RSM) do not outperform a single classifier, even though RSM does introduce diversity between the base classifiers. This suggests that useful diversity must lie in the structure of the base classifier rather than in resampling of the dataset, and it highlights the need for novel generative ensemble learning techniques.
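To make the setting concrete, the following is a minimal sketch (not the paper's implementation) of the kind of ensemble discussed above: a low-variance generative base classifier, here a hand-rolled Gaussian naive Bayes, combined via the Random Subspace Method, where each base model is trained on a random subset of the features and predictions are merged by majority vote. All names and parameter choices (`n_models`, `subspace`) are illustrative assumptions.

```python
import numpy as np

class GaussianNB:
    """Minimal Gaussian naive Bayes: a low-variance generative base model."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        # Per-class feature means, variances, and class priors
        self.mu_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        self.var_ = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes_])
        self.prior_ = np.array([np.mean(y == c) for c in self.classes_])
        return self

    def predict(self, X):
        # log p(x | c) under a diagonal Gaussian, plus log p(c)
        ll = -0.5 * (((X[:, None, :] - self.mu_) ** 2) / self.var_
                     + np.log(2 * np.pi * self.var_)).sum(axis=2)
        return self.classes_[np.argmax(ll + np.log(self.prior_), axis=1)]

def rsm_ensemble(X, y, n_models=11, subspace=0.5, seed=0):
    """Random Subspace Method: each base model sees a random feature subset."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    k = max(1, int(subspace * d))
    models = []
    for _ in range(n_models):
        feats = rng.choice(d, size=k, replace=False)
        models.append((feats, GaussianNB().fit(X[:, feats], y)))
    return models

def rsm_predict(models, X):
    """Combine base-classifier outputs by majority vote."""
    votes = np.stack([m.predict(X[:, f]) for f, m in models])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

On well-separated synthetic data, the single classifier and the RSM ensemble reach comparable accuracy, which is consistent with the observation above that subspace diversity alone need not translate into a gain for this family of models.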