This paper proposes a multi-class classifier-based AdaBoost algorithm for the efficient classification of multi-class data. The traditional AdaBoost algorithm is fundamentally a binary classifier, and even its available multi-class versions have limitations when applied to multi-class problems. To overcome these limitations, we devise an AdaBoost architecture, with an accompanying training algorithm, that uses multi-class classifiers as its weak classifiers instead of a series of binary classifiers. The proposed AdaBoost architecture trains drastically faster and obtains more stable and more accurate classification results than a typical multi-class AdaBoost architecture based on binary weak classifiers. Experiments on an image classification problem with a collected satellite image database were performed. The results show that the proposed architecture reduces training time by 50%-70%, depending on the number of training rounds, while maintaining classification accuracy competitive with AdaBoost.M2.
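The abstract does not specify the proposed architecture in detail, but the core idea, boosting with weak learners that are themselves multi-class classifiers rather than chains of binary ones, can be illustrated with a known algorithm of the same flavor: SAMME (Zhu et al.), sketched below with weighted multi-class decision stumps as the weak learners. This is a minimal illustration of the general technique, not the authors' method; all function names here are our own.

```python
import numpy as np

def fit_stump(X, y, w, n_classes):
    """Weighted multi-class decision stump: split one feature at one
    threshold; each side predicts the class with the largest total weight."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            sides = []
            for mask in (left, ~left):
                if mask.any():
                    counts = np.bincount(y[mask], weights=w[mask],
                                         minlength=n_classes)
                    sides.append(int(counts.argmax()))
                else:
                    sides.append(0)  # empty side: arbitrary class, no effect
            pred = np.where(left, sides[0], sides[1])
            err = w[pred != y].sum()
            if err < best_err:
                best_err, best = err, (j, t, sides[0], sides[1])
    return best, best_err

def samme_fit(X, y, n_rounds=10):
    """SAMME-style multi-class AdaBoost: each round fits one multi-class
    weak classifier on reweighted samples, then up-weights its mistakes."""
    n, K = len(y), int(y.max()) + 1
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        (j, t, cl, cr), err = fit_stump(X, y, w, K)
        err = max(err, 1e-12)
        if err >= 1.0 - 1.0 / K:          # no better than random guessing
            break
        # the extra log(K - 1) term is what adapts AdaBoost to K classes
        alpha = np.log((1.0 - err) / err) + np.log(K - 1.0)
        pred = np.where(X[:, j] <= t, cl, cr)
        w *= np.exp(alpha * (pred != y))  # up-weight misclassified samples
        w /= w.sum()
        ensemble.append((alpha, j, t, cl, cr))
    return ensemble, K

def samme_predict(ensemble, K, X):
    """Weighted vote of all weak classifiers; highest total vote wins."""
    votes = np.zeros((len(X), K))
    for alpha, j, t, cl, cr in ensemble:
        pred = np.where(X[:, j] <= t, cl, cr)
        votes[np.arange(len(X)), pred] += alpha
    return votes.argmax(axis=1)
```

Note that each round here costs one multi-class fit, whereas an AdaBoost.M2-style decomposition must maintain weights over (sample, wrong-label) pairs built from binary comparisons, which is one intuition for the training-time savings the abstract reports.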