Real Adaboost Ensembles with Emphasized Subsampling
IWANN '09 Proceedings of the 10th International Work-Conference on Artificial Neural Networks: Part I: Bio-Inspired Systems: Computational and Ambient Intelligence
We present a novel technique to reduce the computational burden associated with the operational phase of neural networks. To this end, we develop a very simple procedure for fast classification that can be applied to any network whose output is computed as a weighted sum of terms, a formulation that covers a wide variety of neural schemes, such as multi-net systems and Radial Basis Function (RBF) networks, among many others. The idea consists of evaluating the sum terms sequentially, using a series of thresholds associated with the confidence that a partial output will coincide with the overall network's classification decision. The potential of this strategy is illustrated by experiments on a benchmark of binary classification problems, using RealAdaboost and RBF networks as the underlying technologies.
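The core mechanism described in the abstract, evaluating the weighted sum term by term and stopping once a partial output is confident enough, can be sketched as follows. This is a minimal illustration, not the paper's actual procedure: the function name, the per-stage thresholds, and the decision-stump learners are all hypothetical. For concreteness, the thresholds here are chosen conservatively as the sum of the remaining weight magnitudes, which makes early stopping exact; the paper instead derives thresholds from confidence estimates, trading occasional sign flips for further speedup.

```python
import numpy as np

def fast_classify(x, learners, alphas, thresholds):
    """Sequentially accumulate the weighted sum  sum_t alpha_t * h_t(x)
    and stop as soon as the partial sum's magnitude reaches the
    per-stage threshold, i.e. the partial sign is trusted to match
    the full network's sign.  Returns (label, stages_evaluated)."""
    partial = 0.0
    for t, (h, a, theta) in enumerate(zip(learners, alphas, thresholds)):
        partial += a * h(x)
        if abs(partial) >= theta:          # confident enough: stop early
            return np.sign(partial), t + 1
    return np.sign(partial), len(learners)

# Hypothetical ensemble of three decision stumps with weights alphas.
learners = [lambda x: 1.0 if x > 0 else -1.0,
            lambda x: 1.0 if x > 2 else -1.0,
            lambda x: 1.0 if x > -1 else -1.0]
alphas = [0.5, 0.4, 0.3]
# Conservative thresholds: sum of the |alpha| still to be evaluated,
# so an early decision can never be flipped by the remaining terms.
thresholds = [0.4 + 0.3, 0.3, 0.0]

label, used = fast_classify(3.0, learners, alphas, thresholds)   # → (1.0, 1)
label2, used2 = fast_classify(1.0, learners, alphas, thresholds)  # → (1.0, 3)
```

For an easy point like `x = 3.0`, the first term alone (partial sum 0.5) already exceeds the remaining weight mass 0.7? No: it is 0.5 < 0.7, yet for `x` far on one side all stumps agree, so the sum grows quickly; ambiguous points near the boundary force evaluation of all stages, which is exactly the behavior the sequential scheme exploits, since most operational inputs are easy.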