This letter presents the results of two ensemble approaches for increasing the accuracy of land cover classification with support vector machines. Finite ensemble approaches, based on boosting and bagging, and an infinite ensemble approach, created by embedding an infinite hypothesis set in the kernel of the support vector machine, are discussed. Results suggest that the infinite ensemble approach provides a significant increase in classification accuracy over radial basis function kernel-based support vector machines. Among the finite ensemble approaches, bagging works well and performs comparably to the infinite ensemble approach, whereas boosting degrades the performance of the support vector machines. A comparison of computational cost suggests that the finite ensemble approaches require far more processing time than the infinite ensemble approach.
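The two ensemble strategies contrasted above can be sketched with off-the-shelf tools. The following is a minimal illustration, assuming scikit-learn and SciPy: the synthetic dataset and hyperparameters are placeholders, not the letter's land cover data or settings, and the simplified stump kernel K(x, x') = -||x - x'||₁ is one common way to embed an infinite set of decision stumps in an SVM kernel, used here as an assumption about the flavor of infinite ensemble the letter describes.

```python
from scipy.spatial.distance import cdist
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Illustrative synthetic data, standing in for the letter's land cover imagery.
X, y = make_classification(n_samples=600, n_features=10, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Baseline: a single RBF-kernel SVM.
rbf_svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)

# Finite ensemble (bagging): each SVM is trained on a bootstrap sample
# of the training set, and the ensemble predicts by majority vote.
bagged = BaggingClassifier(SVC(kernel="rbf", C=1.0, gamma="scale"),
                           n_estimators=10, random_state=0).fit(X_tr, y_tr)

# Infinite ensemble: a simplified stump kernel that implicitly sums over
# all possible decision stumps (an assumption; the letter's exact kernel
# construction may differ).
def stump_kernel(A, B):
    return -cdist(A, B, metric="cityblock")  # K(x, x') = -||x - x'||_1

infinite = SVC(kernel=stump_kernel, C=1.0).fit(X_tr, y_tr)

for name, clf in [("RBF SVM", rbf_svm), ("bagged SVMs", bagged),
                  ("stump-kernel SVM", infinite)]:
    print(f"{name}: test accuracy {clf.score(X_te, y_te):.3f}")
```

Note the design trade-off the letter reports: the bagged ensemble trains `n_estimators` separate SVMs, while the infinite ensemble trains a single SVM with a different kernel, which is why its processing time can be much lower.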