In this paper, we apply weighted mutual information to feature selection. The proposed hybrid filter-wrapper approach resembles the well-known AdaBoost algorithm in that it focuses on those samples that are not classified or approximated correctly by the features selected so far. Redundancies among features and the bias of the employed learning machine are handled implicitly by our approach. In experiments, we compare the weighted mutual information algorithm with other basic feature subset selection methods that use similar selection criteria, and the results demonstrate the efficiency and effectiveness of our method.
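The abstract does not spell out the procedure, so the following is only a minimal sketch of how its ingredients could fit together, under assumptions of my own: discrete feature values, a plug-in estimator of mutual information under sample weights, an AdaBoost-style exponential re-weighting of misclassified samples, and a weighted nearest-centroid classifier standing in for the unspecified learning machine. All function names (`weighted_mutual_information`, `select_features`, etc.) are hypothetical, not the authors' API.

```python
# Hypothetical sketch of a hybrid filter-wrapper selection loop with
# AdaBoost-style sample re-weighting; not the authors' implementation.
import numpy as np


def weighted_mutual_information(x, y, w):
    """Plug-in estimate of I(X; Y) for discrete x, y under sample weights w."""
    w = w / w.sum()
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (x_idx, y_idx), w)          # weighted joint distribution
    px = joint.sum(axis=1, keepdims=True)        # weighted marginal of X
    py = joint.sum(axis=0, keepdims=True)        # weighted marginal of Y
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())


def nearest_centroid_predict(X_train, y_train, X, w):
    """Weighted nearest-centroid classifier as a stand-in learning machine."""
    classes = np.unique(y_train)
    centroids = np.array([
        np.average(X_train[y_train == c], axis=0, weights=w[y_train == c])
        for c in classes
    ])
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]


def select_features(X, y, n_select):
    """Greedy forward selection by weighted MI with AdaBoost-like updates."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                      # uniform initial sample weights
    selected = []
    for _ in range(n_select):
        remaining = [j for j in range(d) if j not in selected]
        # Filter step: rank remaining features by weighted MI with the labels.
        scores = [weighted_mutual_information(X[:, j], y, w) for j in remaining]
        selected.append(remaining[int(np.argmax(scores))])
        # Wrapper step: evaluate the current subset with the learning machine.
        pred = nearest_centroid_predict(X[:, selected], y, X[:, selected], w)
        miss = pred != y
        err = np.clip(w[miss].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)    # AdaBoost confidence term
        # Boost the weight of misclassified samples so the next MI ranking
        # concentrates on what the current subset fails to explain.
        w *= np.exp(alpha * miss)
        w /= w.sum()
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=200)
    X = np.c_[y ^ (rng.random(200) < 0.1),       # informative, slightly noisy
              rng.integers(0, 3, size=(200, 4))] # irrelevant features
    print(select_features(X, y, n_select=2))     # picks the informative column first
```

The key design point this sketch illustrates is the division of labor: the mutual-information criterion (filter) stays cheap to evaluate per feature, while the wrapped classifier only has to be retrained once per selection round, and the exponential re-weighting is what makes later rounds favor features that complement, rather than duplicate, the ones already chosen.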