A Method for Attribute Selection in Inductive Learning Systems
IEEE Transactions on Pattern Analysis and Machine Intelligence
On changing continuous attributes into ordered discrete attributes
EWSL-91 Proceedings of the European Working Session on Learning
A training algorithm for optimal margin classifiers
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
C4.5: programs for machine learning
Efficient agnostic PAC-learning with simple hypotheses
COLT '94 Proceedings of the seventh annual conference on Computational learning theory
Using analytic QP and sparseness to speed training of support vector machines
Proceedings of the 1998 conference on Advances in neural information processing systems 11
Machine Learning
Advances in applying genetic programming to machine learning, focussing on classification problems
IPDPS'06 Proceedings of the 20th international conference on Parallel and distributed processing
Estimating continuous distributions in Bayesian classifiers
UAI'95 Proceedings of the Eleventh conference on Uncertainty in artificial intelligence
Nearest neighbor pattern classification
IEEE Transactions on Information Theory
A Hybrid Higher Order Neural Classifier for handling classification problems
Expert Systems with Applications: An International Journal
Design of real-time fuzzy bus holding system for the mass rapid transit transfer system
Expert Systems with Applications: An International Journal
In this paper, we present a new method for handling classification problems based on a new fuzzy information gain measure. Using the proposed measure, we present an algorithm that constructs the membership functions, calculates the class degree of each subset of training instances with respect to each class, and calculates the fuzzy entropy of each subset of training instances. From the constructed membership function of each fuzzy set of each feature, the obtained class degrees, and the obtained fuzzy entropies, we then derive an evaluation function for classifying testing instances. The proposed method achieves higher average classification accuracy rates than the methods presented in [John, G. H., & Langley, P. (1995). Estimating continuous distributions in Bayesian classifiers. In Proceedings of the 11th conference on uncertainty in artificial intelligence, Montreal, Canada (pp. 338-345); Platt, J. C. (1999). Using analytic QP and sparseness to speed training of support vector machines. In Proceedings of the 13th annual conference on neural information processing systems, Denver, Colorado (pp. 557-563); Quinlan, J. R. (1993). C4.5: Programs for machine learning. San Francisco: Morgan Kaufmann].
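The computation outlined in the abstract can be illustrated with a minimal sketch. The exact membership-function construction and evaluation function are defined in the paper itself; the snippet below only assumes triangular membership functions and the common definitions of class degree (a class's share of total membership mass in a subset) and fuzzy entropy (Shannon entropy over the class degrees). All function names here are hypothetical illustrations, not the authors' implementation.

```python
import math

def triangular_membership(x, a, b, c):
    """Degree of x in a triangular fuzzy set with feet a, c and peak b
    (an assumed membership-function shape, for illustration only)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def class_degrees(memberships, labels):
    """Class degree of a subset: each class's fraction of the
    total membership mass of the subset's instances."""
    total = sum(memberships)
    if total == 0:
        return {}
    mass = {}
    for m, y in zip(memberships, labels):
        mass[y] = mass.get(y, 0.0) + m
    return {y: d / total for y, d in mass.items()}

def fuzzy_entropy(memberships, labels):
    """Fuzzy entropy of a subset: Shannon entropy over its class degrees."""
    return -sum(p * math.log2(p)
                for p in class_degrees(memberships, labels).values()
                if p > 0)
```

For example, a subset whose membership mass is split evenly between two classes has fuzzy entropy 1.0, while a subset whose mass belongs entirely to one class has fuzzy entropy 0.0, mirroring the behavior of crisp information-gain measures.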