C4.5: programs for machine learning
A Mathematical Theory of Communication
Feature Extraction Using Information-Theoretic Learning
IEEE Transactions on Pattern Analysis and Machine Intelligence
Learning highly non-separable Boolean functions using constructive feedforward neural network
ICANN'07 Proceedings of the 17th international conference on Artificial neural networks
Generalized information potential criterion for adaptive system training
IEEE Transactions on Neural Networks
A hybrid system with regression trees in steel-making process
HAIS'11 Proceedings of the 6th international conference on Hybrid artificial intelligent systems - Volume Part I
Expert Systems with Applications: An International Journal
Shannon entropy, as used in standard top-down decision trees, does not guarantee the best generalization. Split criteria based on generalized entropies offer different compromises between node purity and overall information gain. Modified C4.5 decision trees based on Tsallis and Rényi entropies have been tested on several high-dimensional microarray datasets with interesting results. This approach may be used in any decision tree and information-selection algorithm.
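As a rough illustration of the split criteria discussed in the abstract, the sketch below computes Shannon, Tsallis, and Rényi entropies from class labels and uses them interchangeably in a generic information-gain measure. This is a minimal assumption-laden sketch of the standard textbook formulas (with `q` and `alpha` as the generalization parameters), not the authors' actual C4.5 modification; function names and the choice of log base 2 are illustrative.

```python
import math
from collections import Counter

def shannon(labels):
    """Shannon entropy: H = -sum p * log2(p) over class frequencies."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def tsallis(labels, q=2.0):
    """Tsallis entropy: S_q = (1 - sum p^q) / (q - 1)."""
    n = len(labels)
    return (1.0 - sum((c / n) ** q for c in Counter(labels).values())) / (q - 1.0)

def renyi(labels, alpha=2.0):
    """Renyi entropy: H_a = log2(sum p^a) / (1 - a)."""
    n = len(labels)
    return math.log2(sum((c / n) ** alpha
                         for c in Counter(labels).values())) / (1.0 - alpha)

def gain(entropy_fn, parent, children):
    """Information gain of a candidate split: parent impurity minus the
    size-weighted average impurity of the child nodes, for any entropy_fn."""
    n = len(parent)
    return entropy_fn(parent) - sum(len(ch) / n * entropy_fn(ch)
                                    for ch in children)
```

Swapping `entropy_fn` between `shannon`, `tsallis`, and `renyi` (or varying `q`/`alpha`) changes how strongly the criterion rewards pure nodes versus balanced splits, which is the trade-off the abstract refers to.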