Missing Value Estimation Based on Dynamic Attribute Selection
PADKK '00 Proceedings of the 4th Pacific-Asia Conference on Knowledge Discovery and Data Mining, Current Issues and New Applications
Knowledge discovery from raw data is important both to non-experts and to experts who have difficulty expressing their skills in machine-interpretable form. However, real-world data often contain redundant or unnecessary features, and if these are used directly, the quality of the discovered knowledge can be severely degraded. This paper proposes a new technique for dynamically selecting features. In contrast to static feature selection, this scheme chooses each new feature based on its correlation with the previously selected features. In addition, the scheme does not require setting any threshold, a value that is typically difficult to choose. Experiments on several real-world domains compare tree sizes and test-set error rates, and the results demonstrate the soundness of the scheme.
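The abstract gives only the high-level idea: pick features one at a time, preferring features that are informative yet weakly correlated with those already chosen, with no fixed cutoff threshold. The paper's exact scoring rule is not stated here, so the sketch below is an illustrative greedy criterion (relevance to the target minus mean absolute correlation with already-selected features); the function name and the relevance-minus-redundancy score are assumptions, not the authors' method.

```python
import numpy as np

def dynamic_feature_selection(X, y, k):
    """Greedy sketch of dynamic feature selection.

    Each new feature is chosen for high absolute correlation with the
    target y and low correlation with the features selected so far.
    No threshold is needed: the best-scoring candidate is simply taken
    at every step. The scoring rule here is an assumption for
    illustration, not the criterion from the paper.
    """
    n_features = X.shape[1]
    # Relevance: absolute Pearson correlation of each feature with the target.
    relevance = np.array(
        [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)]
    )
    # Start with the single most relevant feature.
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            # Redundancy: mean absolute correlation with already-selected features.
            redundancy = np.mean(
                [abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected]
            )
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```

On data with a near-duplicate feature, the redundancy term pushes the second pick away from the copy of the first, which is the behavior the abstract contrasts with static (target-correlation-only) selection.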