A Mathematical Theory of Communication. ACM SIGMOBILE Mobile Computing and Communications Review.
Input Feature Selection by Mutual Information Based on Parzen Window. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Cluster Analysis for Gene Expression Data: A Survey. IEEE Transactions on Knowledge and Data Engineering.
Toward Robust Distance Metric Analysis for Similarity Estimation. CVPR '06 Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Volume 1.
Pattern Recognition and Machine Learning (Information Science and Statistics).
Feature Extraction: Foundations and Applications (Studies in Fuzziness and Soft Computing).
Conditional Mutual Information Based Feature Selection. KAM '08 Proceedings of the 2008 International Symposium on Knowledge Acquisition and Modeling.
Gait Feature Subset Selection by Mutual Information. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans - Special section: Best papers from the 2007 Biometrics: Theory, Applications, and Systems (BTAS 07) conference.
A New Graph-Theoretic Approach to Clustering and Segmentation. CVPR '03 Proceedings of the 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition.
A Graph-Based Approach to Feature Selection. GbRPR '11 Proceedings of the 8th International Conference on Graph-Based Representations in Pattern Recognition.
Using Mutual Information for Selecting Features in Supervised Neural Net Learning. IEEE Transactions on Neural Networks.
In many data analysis tasks, one is confronted with very high dimensional data. The feature selection problem is essentially a combinatorial optimization problem, and hence computationally expensive. To overcome this difficulty, it is frequently assumed either that features influence the class variable independently, or that they do so only through pairwise feature interactions. In prior work [18], we described the use of a new measure called multidimensional interaction information (MII) for feature selection. The advantage of MII is that it can capture third- and higher-order feature interactions. Using dominant set clustering, we can extract most of the informative features into the leading dominant sets in advance, thereby limiting the search space for higher-order interactions. In this paper, we provide a comparison of different similarity measures based on mutual information. Experimental results demonstrate the effectiveness of our feature selection method on a number of standard datasets.
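As context for the pairwise-independence assumption the abstract contrasts with MII, a minimal sketch of the baseline mutual-information feature ranking on discrete data might look as follows (the function names and data layout are illustrative, not taken from the paper):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X; Y) between two discrete sequences, in bits."""
    n = len(xs)
    px = Counter(xs)
    py = Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_xy = c / n
        p_x = px[x] / n
        p_y = py[y] / n
        mi += p_xy * math.log2(p_xy / (p_x * p_y))
    return mi

def rank_features_by_mi(feature_columns, labels):
    """Rank feature columns by their mutual information with the class labels.

    Each feature is scored independently of the others -- precisely the
    assumption that higher-order measures such as MII are designed to relax.
    Returns (feature_index, score) pairs, highest score first.
    """
    scores = [(i, mutual_information(col, labels))
              for i, col in enumerate(feature_columns)]
    return sorted(scores, key=lambda s: s[1], reverse=True)
```

For example, a feature that copies the labels attains the full label entropy (1 bit for balanced binary labels), while a constant feature scores 0, so the copy is ranked first.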