Optimization of control parameters for genetic algorithms
IEEE Transactions on Systems, Man and Cybernetics
International Journal of Man-Machine Studies - Special Issue: Knowledge Acquisition for Knowledge-based Systems. Part 5
Applied multivariate statistical analysis
The Use of Background Knowledge in Decision Tree Induction
Machine Learning
A Further Comparison of Splitting Rules for Decision-Tree Induction
Machine Learning
C4.5: programs for machine learning
Cost-sensitive pruning of decision trees
ECML-94 Proceedings of the European Conference on Machine Learning
Inductive Policy: The Pragmatics of Bias Selection
Machine Learning - Special issue on bias evaluation and selection
Machine Learning
A Comparative Analysis of Methods for Pruning Decision Trees
IEEE Transactions on Pattern Analysis and Machine Intelligence
Wrappers for feature subset selection
Artificial Intelligence - Special issue on relevance
Combining support vector and mathematical programming methods for classification
Advances in kernel methods
MetaCost: a general method for making classifiers cost-sensitive
KDD '99 Proceedings of the fifth ACM SIGKDD international conference on Knowledge discovery and data mining
Use of Contextual Information for Feature Ranking and Discretization
IEEE Transactions on Knowledge and Data Engineering
Goal-Directed Classification Using Linear Machine Decision Trees
IEEE Transactions on Pattern Analysis and Machine Intelligence
Knowledge discovery from data?
IEEE Intelligent Systems
Pruning Decision Trees with Misclassification Costs
ECML '98 Proceedings of the 10th European Conference on Machine Learning
The Case against Accuracy Estimation for Comparing Induction Algorithms
ICML '98 Proceedings of the Fifteenth International Conference on Machine Learning
AdaCost: Misclassification Cost-Sensitive Boosting
ICML '99 Proceedings of the Sixteenth International Conference on Machine Learning
A Comparative Study of Cost-Sensitive Boosting Algorithms
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Boosting Trees for Cost-Sensitive Classifications
ECML '98 Proceedings of the 10th European Conference on Machine Learning
Methods for cost-sensitive learning
Cost-Sensitive Learning by Cost-Proportionate Example Weighting
ICDM '03 Proceedings of the Third IEEE International Conference on Data Mining
Simplifying decision trees: A survey
The Knowledge Engineering Review
An iterative method for multi-class cost-sensitive learning
Proceedings of the tenth ACM SIGKDD international conference on Knowledge discovery and data mining
Test Strategies for Cost-Sensitive Decision Trees
IEEE Transactions on Knowledge and Data Engineering
Cost-sensitive feature acquisition and classification
Pattern Recognition
Proceedings of the 24th international conference on Machine learning
Anytime induction of low-cost, low-error classifiers: a sampling-based approach
Journal of Artificial Intelligence Research
A system for induction of oblique decision trees
Journal of Artificial Intelligence Research
An empirical study of the noise impact on cost-sensitive learning
IJCAI'07 Proceedings of the 20th international joint conference on Artificial intelligence
The foundations of cost-sensitive learning
IJCAI'01 Proceedings of the 17th international joint conference on Artificial intelligence - Volume 2
A survey of cost-sensitive decision tree induction algorithms
ACM Computing Surveys (CSUR)
Expert Systems with Applications: An International Journal
This article presents a new decision tree learning algorithm called CSNL that induces Cost-Sensitive Non-Linear decision trees. The algorithm is based on the hypothesis that nonlinear decision nodes provide a better basis than axis-parallel decision nodes, and it uses discriminant analysis to construct nonlinear decision trees that take misclassification costs into account. The algorithm is evaluated on seventeen datasets, and its results are compared with those of two well-known cost-sensitive algorithms, ICET and MetaCost, which generate multiple trees and have achieved some of the best results to date. CSNL performs at least as well as, and often better than, these algorithms on more than twelve of the datasets, and it is considerably faster. Combining bagging with CSNL further improves its performance, demonstrating the significant benefit of nonlinear decision nodes.
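The abstract does not give CSNL's internals, but the core idea it describes — a nonlinear decision node built by discriminant analysis with class priors weighted by misclassification costs — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the function name `quadratic_discriminant_split` and the cost-weighted-prior formulation are assumptions for the example.

```python
import numpy as np

def quadratic_discriminant_split(X, y, cost):
    """Illustrative cost-sensitive nonlinear split for two classes.

    Fits a quadratic discriminant per class and weights each class
    prior by its misclassification cost (cost[c] = cost of
    misclassifying an example whose true class is c), so the decision
    boundary shifts toward the cheaper-to-misclassify class.
    Returns a function mapping a feature vector to a branch (0 or 1).
    """
    params = {}
    n = len(y)
    for c in (0, 1):
        Xc = X[y == c]
        mu = Xc.mean(axis=0)
        sigma = np.cov(Xc, rowvar=False)
        prior = len(Xc) / n
        params[c] = (mu, sigma, prior * cost[c])  # cost-weighted prior

    def branch(x):
        scores = []
        for c in (0, 1):
            mu, sigma, w = params[c]
            d = x - mu
            inv = np.linalg.inv(sigma)
            # Quadratic (nonlinear) discriminant score: log Gaussian
            # density plus the log of the cost-weighted prior.
            s = -0.5 * d @ inv @ d \
                - 0.5 * np.log(np.linalg.det(sigma)) \
                + np.log(w)
            scores.append(s)
        return int(np.argmax(scores))

    return branch
```

Because the score is quadratic in `x`, each node's decision surface is a conic rather than an axis-parallel threshold, which is the kind of nonlinear node the abstract argues for; raising `cost[c]` enlarges the region assigned to class `c`.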