MICCAI'11 Proceedings of the 2011 international conference on Prostate cancer imaging: image analysis and image-guided interventions
Most feature selection algorithms based on information-theoretic learning (ITL) adopt either a ranking process or a greedy search as their search strategy. The former selects features individually and therefore ignores feature interactions and dependencies. The latter depends heavily on the search path, since only one path is explored and no backtracking is possible. Moreover, both strategies typically yield heuristic algorithms. To address these problems, this article proposes a novel ITL feature selection framework based on correntropy, namely correntropy-based feature selection using binary projection (BPFS). Our framework selects features by projecting the original high-dimensional data onto a low-dimensional space through a special binary projection matrix. The objective function maximizes the correntropy between the selected features and the class labels, and it can be optimized efficiently with standard mathematical tools. We apply the half-quadratic method to optimize the objective in an iterative manner, where each iteration reduces to an assignment subproblem that can be solved very efficiently with off-the-shelf toolboxes. Comparative experiments on six real-world datasets indicate that our framework is both effective and efficient.
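The objective above is built on correntropy, a localized similarity measure from ITL that, unlike mean squared error, saturates on large errors and is therefore robust to outliers. The following is a minimal NumPy sketch of the empirical correntropy estimator with a Gaussian kernel (an illustration of the measure only, not the paper's BPFS algorithm; the kernel width `sigma` is a free parameter chosen here for illustration):

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    # Empirical correntropy: the mean of a Gaussian kernel applied to
    # the pointwise errors x - y. Bounded in (0, 1]; equals 1 when x == y.
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.mean(np.exp(-e**2 / (2.0 * sigma**2))))

# The Gaussian kernel saturates on large errors, so a single outlier
# reduces correntropy by at most 1/n, while MSE grows without bound.
x = np.zeros(100)
y = np.zeros(100)
y_outlier = y.copy()
y_outlier[0] = 1000.0

print(correntropy(x, y))            # 1.0 for identical signals
print(correntropy(x, y_outlier))    # ~0.99: one outlier costs ~1/n
print(np.mean((x - y_outlier)**2))  # MSE explodes to 10000.0
```

This bounded, outlier-insensitive behavior is what motivates maximizing correntropy between the selected features and the class labels rather than a second-order criterion.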