The major drawback of Support Vector Machines (SVMs) is their training time, which grows at least quadratically with the number of training examples. Among the many approaches developed to alleviate this limitation, several works have shown that mixtures of experts can drastically reduce the runtime of SVMs. Such a mixture trains a set of SVMs, each on a subset of the original dataset, and combines their outputs through a gater. The present work proposes a new support vector mixture in which Sugeno's fuzzy integral serves as the gater, removing the time complexity induced by conventional gaters such as artificial neural networks. Experiments on standard optical character and face recognition datasets show that the proposed approach significantly reduces runtime while matching or exceeding the accuracy of a single SVM trained on the whole dataset.
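The gating step described above can be illustrated with a minimal sketch of Sugeno fuzzy-integral aggregation. This is not the paper's implementation: the function names, the example scores, and in particular the simplifying assumption that the fuzzy densities sum to 1 (which makes the λ-fuzzy measure additive, λ = 0) are assumptions for illustration only. Each expert SVM is taken to emit a confidence in [0, 1] per class, and its density reflects its assumed importance (e.g. validation accuracy).

```python
# Hedged sketch of a Sugeno fuzzy integral used as a gater over per-expert
# confidence scores. Simplifying assumption: the fuzzy densities sum to 1,
# so the lambda-fuzzy measure reduces to an additive measure (lambda = 0).

def sugeno_integral(scores, densities):
    """Aggregate expert confidences in [0, 1] with a Sugeno fuzzy integral.

    scores[i]    -- confidence of expert i for the class under consideration
    densities[i] -- fuzzy density (importance) of expert i; assumed to sum to 1
    """
    # Visit experts in order of decreasing score.
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    g = 0.0      # measure g(A_k) of the top-k experts (additive under lambda=0)
    best = 0.0
    for i in order:
        g += densities[i]
        best = max(best, min(scores[i], g))  # Sugeno: max over min(h_(k), g(A_k))
    return best

def gate(per_class_scores, densities):
    """Pick the class whose Sugeno integral over expert scores is largest."""
    return max(per_class_scores,
               key=lambda c: sugeno_integral(per_class_scores[c], densities))

# Toy example: three expert SVMs vote on two classes.
densities = [0.5, 0.3, 0.2]
scores = {"A": [0.9, 0.8, 0.2], "B": [0.3, 0.4, 0.95]}
print(gate(scores, densities))  # -> A
```

In this toy case class "A" wins with an integral of 0.8 versus 0.4 for "B": expert 3's high score for "B" is capped by its low density, which is exactly the robustness-to-weak-experts behavior the fuzzy-integral gater is meant to provide.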