Reducing the dimensionality of the problem space by detecting and removing variables that contribute little or nothing to classification relieves both the computational load and the instance acquisition effort, since every data attribute would otherwise have to be accessed on each pass. The feature selection approach in this paper is based on the coherent accumulation of data about class centers along the coordinates of informative features: variables are ranked by the degree to which they exhibit random behavior. The results are verified with the Nearest Neighbor classifier, which also helps address feature irrelevance and redundancy, issues that ranking alone does not resolve. Additionally, feature ranking methods from independent sources are brought in for direct comparison.
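A minimal sketch of this pipeline, assuming a simple separation-to-spread score as a stand-in for the paper's coherence criterion (the synthetic data, the scoring function, and all helper names here are illustrative, not taken from the paper): each feature is scored by how far apart its class centers sit relative to the within-class spread, so a feature behaving randomly with respect to the classes scores near zero; features are then ranked and the top one is checked with a 1-NN classifier.

```python
import random

random.seed(0)

def make_data(n=200):
    """Two-class synthetic data: feature 0 is informative, feature 1 is noise."""
    data = []
    for _ in range(n):
        label = random.randint(0, 1)
        informative = random.gauss(3.0 * label, 0.5)   # class centers at 0 and 3
        noise = random.uniform(0.0, 1.0)               # carries no class signal
        data.append(([informative, noise], label))
    return data

def feature_scores(data, n_features=2):
    """Illustrative stand-in for the paper's criterion: separation of class
    centers divided by pooled within-class spread; random features score ~0."""
    scores = []
    for j in range(n_features):
        by_class = {}
        for x, y in data:
            by_class.setdefault(y, []).append(x[j])
        means = {c: sum(v) / len(v) for c, v in by_class.items()}
        # pooled within-class standard deviation
        sq = sum((v - means[c]) ** 2 for c, vals in by_class.items() for v in vals)
        spread = (sq / len(data)) ** 0.5
        scores.append(abs(means[0] - means[1]) / (spread + 1e-9))
    return scores

def nn_accuracy(train, test, features):
    """1-NN accuracy using only the selected feature indices."""
    hits = 0
    for x, y in test:
        nearest = min(train,
                      key=lambda t: sum((x[j] - t[0][j]) ** 2 for j in features))
        hits += (nearest[1] == y)
    return hits / len(test)

train, test = make_data(150), make_data(50)
scores = feature_scores(train)
ranking = sorted(range(len(scores)), key=lambda j: -scores[j])
acc_top = nn_accuracy(train, test, [ranking[0]])
```

On this toy data the informative feature ranks first and the 1-NN classifier restricted to it separates the classes almost perfectly, while the noise feature's score stays close to zero.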