A natural way to deal with training samples in imbalanced class problems is to prune them, removing redundant patterns (those that are easy to classify and probably over-represented) and label-noisy patterns (those that belong to one class but are labelled as members of another). This allows classifier construction to focus on borderline patterns, which are likely to be the most informative ones. To appropriately define these subsets, in this work we use as base classifiers the so-called parallel perceptrons, a novel approach to committee machine training that, among other things, makes it possible to define margins for hidden unit activations in a natural way. We use these margins to characterize the above pattern types and to iteratively select subsamples from an initial training set, which enhances classification accuracy and yields balanced classifier performance even when class sizes are greatly different.
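The abstract does not include code, so the sketch below only illustrates the general idea of margin-based pruning in Python. A plain linear perceptron stands in for the parallel perceptrons of the paper (whose margin definition is not reproduced here); each training pattern receives a signed margin, and patterns with a very large positive margin are treated as redundant while patterns with a clearly negative margin are treated as label noise, so that only borderline patterns are kept. The function name, the thresholds, and the toy data are illustrative assumptions, not the authors' actual procedure.

    import numpy as np

    def margin_based_prune(X, y, n_iter=20, lr=0.1,
                           redundant_thr=1.5, noise_thr=-0.5):
        # Train a plain linear perceptron, then compute a signed margin
        # y_i * (w . x_i) for every training pattern. Large positive
        # margins are taken as redundant (easy, probably over-represented),
        # clearly negative margins as label noise; the borderline
        # remainder is kept for further training.
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append bias input
        w = np.zeros(Xb.shape[1])
        for _ in range(n_iter):                          # perceptron updates
            for xi, yi in zip(Xb, y):
                if yi * (xi @ w) <= 0:
                    w += lr * yi * xi
        scores = Xb @ w
        margins = y * scores / (np.std(scores) + 1e-12)  # normalised margins
        redundant = margins > redundant_thr              # easy patterns
        noisy = margins < noise_thr                      # probable label noise
        keep = ~(redundant | noisy)                      # borderline patterns
        return keep, margins

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Imbalanced two-class toy problem with a few flipped labels.
        X = np.vstack([rng.normal(1.5, 1.0, size=(30, 2)),
                       rng.normal(-1.5, 1.0, size=(300, 2))])
        y = np.array([1] * 30 + [-1] * 300)
        y[rng.choice(len(y), size=10, replace=False)] *= -1
        keep, _ = margin_based_prune(X, y)
        print(f"kept {keep.sum()} of {len(y)} training patterns")

An iterative version, closer in spirit to the subsample selection described in the abstract, would retrain on the kept subset and repeat the pruning until the retained set stabilizes; the single-pass form above is only meant to show how activation margins can separate redundant, noisy, and borderline patterns.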