On-Line Error Detection of Annotated Corpus Using Modular Neural Networks
ICANN '01 Proceedings of the International Conference on Artificial Neural Networks
Massively Parallel Classification of EEG Signals Using Min-Max Modular Neural Networks
ICANN '01 Proceedings of the International Conference on Artificial Neural Networks
The Journal of Machine Learning Research
Hierarchical Incremental Class Learning with Reduced Pattern Training
Neural Processing Letters
Multi-class pattern classification using neural networks
Pattern Recognition
Fuzzy feature selection based on min-max learning rule and extension matrix
Pattern Recognition
Intelligent Data Analysis
Distributed Nearest Neighbor-Based Condensation of Very Large Data Sets
IEEE Transactions on Knowledge and Data Engineering
Comparing Combination Rules of Pairwise Neural Networks Classifiers
Neural Processing Letters
A theoretical framework for multiple neural network systems
Neurocomputing
Variations of the two-spiral task
Connection Science
Label ranking by learning pairwise preferences
Artificial Intelligence
Multilabel classification via calibrated label ranking
Machine Learning
Neural Information Processing
Partial Discriminative Training of Neural Networks for Classification of Overlapping Classes
ANNPR '08 Proceedings of the 3rd IAPR workshop on Artificial Neural Networks in Pattern Recognition
A Sieving ANN for Emotion-Based Movie Clip Classification
IEICE - Transactions on Information and Systems
Incorporating Prior Knowledge into Task Decomposition for Large-Scale Patent Classification
ISNN 2009 Proceedings of the 6th International Symposium on Neural Networks: Advances in Neural Networks - Part II
Computational Statistics & Data Analysis
Troika - An improved stacking schema for classification tasks
Information Sciences: an International Journal
Tree architecture pattern distributor: a task decomposition classification approach
IJCNN'09 Proceedings of the 2009 international joint conference on Neural Networks
A class decomposition approach for GA-based classifiers
Engineering Applications of Artificial Intelligence
Incremental learning of support vector machines by classifier combining
PAKDD'07 Proceedings of the 11th Pacific-Asia conference on Advances in knowledge discovery and data mining
VISUAL'07 Proceedings of the 9th international conference on Advances in visual information systems
ICONIP'10 Proceedings of the 17th international conference on Neural information processing: theory and algorithms - Volume Part I
A modular decision-tree architecture for better problem understanding
SEAL'10 Proceedings of the 8th international conference on Simulated evolution and learning
Improvement on response performance of min-max modular classifier by symmetric module selection
ISNN'05 Proceedings of the Second international conference on Advances in neural networks - Volume Part II
Learning from label preferences
DS'11 Proceedings of the 14th international conference on Discovery science
Multi-view face recognition with min-max modular SVMs
ICNC'05 Proceedings of the First international conference on Advances in Natural Computation - Volume Part II
Gender recognition using a min-max modular support vector machine
ICNC'05 Proceedings of the First international conference on Advances in Natural Computation - Volume Part II
A modular k-nearest neighbor classification method for massively parallel text categorization
CIS'04 Proceedings of the First international conference on Computational and Information Science
Typical sample selection and redundancy reduction for min-max modular network with GZC function
ISNN'05 Proceedings of the Second international conference on Advances in Neural Networks - Volume Part I
Structure pruning strategies for min-max modular network
ISNN'05 Proceedings of the Second international conference on Advances in Neural Networks - Volume Part I
A hierarchical and parallel method for training support vector machines
ISNN'05 Proceedings of the Second international conference on Advances in Neural Networks - Volume Part I
Task decomposition using geometric relation for min-max modular SVMs
ISNN'05 Proceedings of the Second international conference on Advances in Neural Networks - Volume Part I
Gender recognition using a min-max modular support vector machine with equal clustering
ISNN'06 Proceedings of the Third international conference on Advances in Neural Networks - Volume Part II
Prediction of protein subcellular multi-locations with a min-max modular support vector machine
ISNN'06 Proceedings of the Third international conference on Advances in Neural Networks - Volume Part III
A modular reduction method for k-NN algorithm with self-recombination learning
ISNN'06 Proceedings of the Third international conference on Advances in Neural Networks - Volume Part I
An algorithm for pruning redundant modules in min-max modular network with GZC function
ICNC'05 Proceedings of the First international conference on Advances in Natural Computation - Volume Part I
A general procedure for combining binary classifiers and its performance analysis
ICNC'05 Proceedings of the First international conference on Advances in Natural Computation - Volume Part I
One-against-all ensemble for multiclass pattern classification
Applied Soft Computing
Pruning training samples using a supervised clustering algorithm
ISNN'10 Proceedings of the 7th international conference on Advances in Neural Networks - Volume Part II
Improving neural networks classification through chaining
ICANN'12 Proceedings of the 22nd international conference on Artificial Neural Networks and Machine Learning - Volume Part II
Parallel approaches to machine learning-A comprehensive survey
Journal of Parallel and Distributed Computing
Modular Neural Tile Architecture for Compact Embedded Hardware Spiking Neural Network
Neural Processing Letters
We propose a method for decomposing pattern classification problems based on the class relations among the training data. Using this method, we can divide a K-class classification problem into a series of K(K-1)/2 two-class problems. Each two-class problem discriminates class Ci from class Cj for i = 1, …, K and j = i+1, …, K, while the training data belonging to the other K-2 classes are ignored. If the two-class problem of discriminating class Ci from class Cj is still hard to learn, we can further break it down into a set of two-class subproblems as small as we expect. Since each of the two-class problems can be treated as a completely separate classification problem within the proposed learning framework, all of the two-class problems can be learned in parallel. We also propose two module combination principles that give practical guidelines for integrating the individually trained network modules. After each of the two-class problems has been learned by a network module, we can easily integrate all of the trained modules into a min-max modular (M3) network according to the module combination principles and obtain a solution to the original problem. Consequently, a large-scale and complex K-class classification problem can be solved effortlessly and efficiently by learning a series of smaller and simpler two-class problems in parallel.
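The decomposition and min-max combination described above can be sketched as follows. This is a minimal illustrative assumption, not the paper's actual network modules: each pairwise module here is a toy nearest-centroid discriminant standing in for a trained network, and all function names are hypothetical. The combination rule, however, follows the abstract: one module per class pair trained while ignoring the other K-2 classes, a MIN unit taking the minimum of each class's K-1 pairwise outputs, and a final maximum over the per-class scores.

```python
# Sketch of pairwise task decomposition and min-max (M3) combination.
# Assumption: a simple nearest-centroid score replaces each trained
# two-class network module; names here are illustrative.
import itertools
import math

def train_module(data, ci, cj):
    """Build a two-class module discriminating class ci from class cj,
    ignoring training data from all other classes."""
    pts_i = [x for x, y in data if y == ci]
    pts_j = [x for x, y in data if y == cj]
    mean = lambda pts: tuple(sum(c) / len(pts) for c in zip(*pts))
    mi, mj = mean(pts_i), mean(pts_j)
    def module(x):
        di, dj = math.dist(x, mi), math.dist(x, mj)
        # Output in (0, 1): high means "x belongs to class ci".
        return dj / (di + dj)
    return module

def build_m3(data, classes):
    # One module per unordered class pair; the reversed pair reuses the
    # same module with its output complemented.
    modules = {}
    for ci, cj in itertools.combinations(classes, 2):
        m = train_module(data, ci, cj)
        modules[(ci, cj)] = m
        modules[(cj, ci)] = lambda x, m=m: 1.0 - m(x)
    def predict(x):
        # MIN unit: class ci's score is the minimum over its K-1
        # pairwise modules; the final decision takes the maximum score.
        scores = {ci: min(modules[(ci, cj)](x) for cj in classes if cj != ci)
                  for ci in classes}
        return max(scores, key=scores.get)
    return predict

# Toy 3-class data set: three well-separated clusters in the plane.
data = [((0.0, 0.0), 0), ((0.2, 0.1), 0),
        ((3.0, 0.0), 1), ((3.1, 0.2), 1),
        ((0.0, 3.0), 2), ((0.1, 3.2), 2)]
predict = build_m3(data, classes=[0, 1, 2])
print(predict((0.1, 0.1)))  # point near the class-0 cluster
```

Because every pairwise module depends only on its own two classes, the calls to train_module are independent and could be dispatched to separate workers, which is what makes the parallel learning claimed in the abstract possible.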