Mixture-of-experts (ME) models comprise a family of modular neural network architectures that aim to decompose complex problems into simpler subtasks. This is done by deploying a separate gating module that softly divides the input space into overlapping regions, each assigned to one or more expert networks. In contrast, support vector machines (SVMs) are kernel-based, neural-network-like models that constitute an approximate implementation of the structural risk minimization principle. Such learning machines follow the simple but powerful idea of nonlinearly mapping input data into a high-dimensional feature space, in which a linear decision surface discriminating different regions is designed. In this work, we formally characterize and empirically evaluate a novel approach, named the Mixture of Support Vector Machine Experts (MSVME), whose main purpose is to combine the complementary properties of the SVM and ME models. In the formal characterization, an algorithm based on a maximum likelihood criterion is derived for MSVME training, and we demonstrate that each expert can be trained from an SVM perspective. In the empirical evaluation, simulation results on nonlinear dynamic system identification problems are reported, contrasting the performance of the MSVME approach with that of conventional SVM and ME models.
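The gated combination rule described above can be sketched as follows. This is a minimal illustrative example, not the MSVME algorithm itself: the gate is a fixed softmax over linear scores, and the experts are arbitrary callables standing in for trained SVMs; the maximum-likelihood training procedure and the SVM-based expert fitting from the paper are not reproduced.

```python
import numpy as np

def softmax(z):
    # Numerically stable row-wise softmax.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class MixtureOfExperts:
    """Sketch of the ME combination rule: a softmax gate softly
    partitions the input space into overlapping regions, and each
    input's prediction is the gate-weighted sum of the experts'
    outputs. (Hypothetical parameter names; illustrative only.)"""

    def __init__(self, gate_weights, experts):
        self.V = gate_weights    # (d, K) gating parameters, one column per expert
        self.experts = experts   # list of K callables f_k(X) -> (n,) predictions

    def predict(self, X):
        G = softmax(X @ self.V)                            # (n, K) soft region assignments
        F = np.column_stack([f(X) for f in self.experts])  # (n, K) expert outputs
        return (G * F).sum(axis=1)                         # convex combination per input
```

Because the gate outputs sum to one, each prediction is a convex combination of the expert outputs, so the mixture interpolates smoothly between experts in the overlap regions of the input space.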