Learning problems over high-dimensional data are common in real-world applications. In this study, a challenging, large, and lifelike database, the German Traffic Sign Benchmark, containing 43 classes and 51,840 images, is used to demonstrate the strength of our proposed boosted support vector machine with a deep learning architecture. Recognition of traffic signs is difficult: it involves multiple categories, contains subsets of classes that may appear very similar to each other, and tends to show large within-class variation in visual appearance due to illumination changes, partial occlusions, rotations, and weather conditions. By combining a low-variance-error boosting algorithm, a low-bias-error support vector machine, and a deep learning architecture, an efficient and effective boosted support vector machine method is presented. It has been shown to greatly reduce data dimensionality and to build classification models with higher prediction accuracy while using fewer features and training instances. In evaluation, the proposed method outperforms AdaBoost.M1, cw-Boost, and the support vector machine; it achieves ultra-fast processing (0.0038 per prediction) and high accuracy (93.5 %) on separate test data while using less than 35 % of the training instances. Moreover, the method runs on a standard standalone PC and does not require supercomputers with enormous memory.