This article considers the cost-dependent construction of linear and piecewise linear classifiers. Classical learning algorithms from the fields of artificial neural networks and machine learning either consider no costs at all or allow only costs that depend on the classes of the training examples. In contrast to such class-dependent costs, we consider costs that are example dependent, i.e., that depend on both the features and the class of each individual example. We present a cost-sensitive extension of a modified version of the well-known perceptron algorithm that can also be applied when the classes are not linearly separable. We also present an extended version of the hybrid learning algorithm DIPOL that can handle linear non-separability, multimodal class distributions, and multi-class learning problems. We show that example-dependent costs are a true generalization of class-dependent costs. The approach is general and can be extended to other neural network architectures such as multi-layer perceptrons and radial basis function networks.
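The core idea of weighting perceptron updates by example-dependent costs can be sketched as follows. This is a minimal illustration of the general technique, not the authors' modified algorithm; the function name, the fixed learning rate, and the epoch-based stopping rule are assumptions for the sketch.

```python
import numpy as np

def cost_sensitive_perceptron(X, y, costs, epochs=100, lr=1.0):
    """Perceptron variant in which each misclassified example pushes the
    hyperplane with a force proportional to its own cost.

    X     : (n, d) array of feature vectors
    y     : (n,) array of labels in {-1, +1}
    costs : (n,) array of per-example misclassification costs
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for i in range(n):
            # Misclassified (or exactly on the boundary): update the
            # weights, scaled by this example's individual cost.
            if y[i] * (X[i] @ w + b) <= 0:
                w += lr * costs[i] * y[i] * X[i]
                b += lr * costs[i] * y[i]
                errors += 1
        if errors == 0:  # converged on a separable sample
            break
    return w, b
```

Class-dependent costs are recovered as the special case where `costs[i]` depends only on `y[i]`; setting all costs to 1 recovers the ordinary perceptron update.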