Information Sciences: an International Journal
One of the keys for multilayer perceptrons (MLPs) to solve multi-class learning problems is achieving good convergence and generalization while learning only small-scale subsets, i.e., a small part of the original larger-scale data set. This paper first decomposes an n-class problem into n two-class problems, and then uses n class-modular MLPs to solve them one by one. Each class-modular MLP is responsible for forming the decision boundaries of its represented class, and thus can be trained only on samples from that class and some neighboring ones. When solving a two-class problem, an MLP has to cope with such unfavorable situations as imbalanced training data, locally sparse and weakly distributed regions, and open decision boundaries. One solution is to virtually reinforce the samples from the minority classes, or those in the thin regions, by suitable enlargement factors. In addition, the effective range of each MLP is localized by a correction coefficient related to the distribution of its represented class. In brief, this paper focuses on the formation of economical learning subsets, the virtual balancing of imbalanced training sets, and the localization of the generalization regions of MLPs. Results on letter recognition and extended handwritten digit recognition show that the proposed methods are effective.
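The decomposition and virtual-balance steps described above can be sketched roughly as follows. This is a minimal illustration, not the paper's actual procedure: it assumes a plain one-vs-rest split in which all non-target samples serve as negatives (the paper restricts negatives to neighboring classes), and it uses a simple hypothetical balancing rule that replicates the positive samples by an integer enlargement factor chosen to roughly equalize the two classes. All function names are made up for this sketch.

```python
import numpy as np

def make_two_class_subset(X, y, target, enlargement_factor=1):
    """Build the training subset for one class-modular MLP (one-vs-rest).

    Samples of class `target` become the positives; all others become
    negatives (a simplification -- the paper trains each module only on
    the represented class and its neighbors). The positives are
    "virtually reinforced" by replicating them `enlargement_factor`
    times, a crude stand-in for the paper's virtual-balance idea.
    """
    pos = X[y == target]
    neg = X[y != target]
    pos = np.repeat(pos, enlargement_factor, axis=0)
    X_sub = np.vstack([pos, neg])
    y_sub = np.concatenate([np.ones(len(pos)), np.zeros(len(neg))])
    return X_sub, y_sub

def decompose(X, y, classes):
    """Decompose an n-class problem into n two-class training subsets,
    choosing an enlargement factor per class so that positives and
    negatives end up roughly equal in number (assumed rule)."""
    subsets = {}
    for c in classes:
        n_pos = int((y == c).sum())
        n_neg = len(y) - n_pos
        factor = max(1, n_neg // max(1, n_pos))  # hypothetical choice
        subsets[c] = make_two_class_subset(X, y, c, factor)
    return subsets
```

Each resulting subset would then train one two-class MLP, whose outputs are combined at recognition time; the paper's localization step (the correction coefficient) is not modeled here.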