Since the loss function is central to statistical learning, this paper proposes adding heavier penalties to the heterogeneous examples of a dataset to obtain a stricter convex loss function for optimization. The concept is realized by replacing the class labels of support vector machines (SVMs) with larger real values. Using these magnified real-valued class labels to convey the additional penalties, an elementary stage-wise classifier is developed to achieve high training accuracy. This article presents the underlying theory and the induced properties of the stage-wise classifier for further application. Two types of re-weighting rules are devised to connect consecutive stages and produce the heavier penalties. Compared with a qualified underlying prototype, the empirical results show that the classification complexity of the proposed classifier increases as its accuracy improves under the various additional penalties. Although the stricter penalties may cause undesirable over-fitting, the flexible re-weighting strategy remains beneficial for some applications.
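The idea of conveying penalties through magnified real-valued targets across stages can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's actual algorithm: it uses a plain linear least-squares fit in place of an SVM, and a single simple re-weighting rule (multiply the targets of currently misclassified examples by a fixed factor), purely to show how magnified labels act as heavier penalties in the next stage's loss. All names (`stagewise_magnified_labels`, `magnify`, `n_stages`) are invented for the example.

```python
import numpy as np

def stagewise_magnified_labels(X, y, n_stages=3, magnify=2.0):
    """Hypothetical stage-wise classifier: extra penalties are conveyed
    by magnifying the real-valued targets of misclassified examples.

    X : (n, d) feature matrix; y : (n,) labels in {-1, +1}.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    t = y.astype(float)                            # real-valued class labels
    w = None
    for _ in range(n_stages):
        # Fit the current stage to the (possibly magnified) targets.
        w, *_ = np.linalg.lstsq(Xb, t, rcond=None)
        wrong = np.sign(Xb @ w) != y
        # Re-weighting rule: magnified targets impose a heavier squared-error
        # penalty on these examples in the next stage.
        t[wrong] *= magnify
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.sign(Xb @ w)
```

Because a magnified target enters the next stage's squared-error loss quadratically, the fit is pulled toward the examples the previous stage got wrong; the `magnify` factor plays the role of the stricter penalty, and overly large values reproduce the over-fitting risk the abstract mentions.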