Feature selection methods play a significant role in the classification of high-dimensional data. These methods select the most relevant subset of features, one that describes the data adequately. Mutual information (MI), grounded in information theory, is one of the metrics used for measuring the relevance of features. This paper analyses various feature selection methods with respect to (1) the reduction in the number of features and (2) the performance of a Naïve Bayes classification model trained on the reduced feature set. Two research gaps are identified: (1) MI is computed from the whole sample space instead of the unclassified sample subspace; (2) methods consider only the relevance of features, or a tradeoff between relevance and redundancy, while ignoring the class-conditional interaction of features. In this paper, we propose a general MI-based evaluation function for feature selection. The proposed evaluation function is implemented using MI values computed dynamically from unclassified instances. The effectiveness of the proposed feature selection method is evaluated empirically by comparing classification results on the KDD Cup 1999 benchmark intrusion detection dataset. The results indicate the practicability and effectiveness of the proposed method for applications that demand high accuracy and stability of predictions. Copyright © 2011 John Wiley & Sons, Ltd.
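As an illustrative sketch only (not the authors' implementation, which computes MI dynamically from unclassified instances), MI-based relevance ranking of discrete features against class labels can be estimated from counts as below; the names `mutual_information` and `rank_features` are hypothetical:

```python
import math
from collections import Counter

def mutual_information(feature, labels):
    """Estimate I(X; C) in bits from paired discrete observations."""
    n = len(feature)
    px = Counter(feature)          # marginal counts of feature values
    pc = Counter(labels)           # marginal counts of class labels
    pxc = Counter(zip(feature, labels))  # joint counts
    mi = 0.0
    for (x, c), joint in pxc.items():
        # p(x,c) * log2( p(x,c) / (p(x) p(c)) ), with probabilities as count ratios
        mi += (joint / n) * math.log2(joint * n / (px[x] * pc[c]))
    return mi

def rank_features(X, y):
    """Rank the feature columns of X by their MI with the class labels y."""
    scores = [(j, mutual_information([row[j] for row in X], y))
              for j in range(len(X[0]))]
    return sorted(scores, key=lambda s: s[1], reverse=True)
```

A feature perfectly aligned with the class labels scores 1 bit on a balanced two-class sample, while an independent feature scores 0, so the ranking places it first; a full selection method would additionally penalize redundancy between already-selected features, which this sketch omits.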