Elements of information theory
Input Feature Selection by Mutual Information Based on Parzen Window
IEEE Transactions on Pattern Analysis and Machine Intelligence
Hybrid Genetic Algorithms for Feature Selection
IEEE Transactions on Pattern Analysis and Machine Intelligence
Pattern Recognition and Neural Networks
A new pruning heuristic based on variance analysis of sensitivity information
IEEE Transactions on Neural Networks
A Novel Pruning Algorithm for Optimizing Feedforward Neural Network of Classification Problems
Neural Processing Letters
Reverse Engineering the Neural Networks for Rule Extraction in Classification Problems
Neural Processing Letters
This brief presents a two-phase approach for pruning both the input and hidden units of multilayer perceptrons (MLPs) based on mutual information (MI). First, all features of the input vectors are ranked according to their relevance to the target outputs through a forward selection strategy. The salient input units of an MLP are then determined from the ranking order and from their contributions to the network's performance, so that irrelevant input features can be identified and eliminated. Second, redundant hidden units are removed from the trained MLP one by one according to a novel relevance measure. Compared with related methods, the proposed strategy exhibits better performance. Moreover, experimental results show that the proposed method is comparable or even superior to the support vector machine (SVM) and support vector regression (SVR). Finally, the advantages of the MI-based method are investigated in comparison with the sensitivity analysis (SA)-based method.
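The first phase (ranking input features by their mutual information with the target) can be sketched roughly as follows. This is only an illustration using a simple histogram-based plug-in MI estimator, not the Parzen-window estimator referenced above; the function names and bin count are arbitrary choices for the sketch.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Estimate MI between a continuous feature x and a discrete target y
    by histogram binning (a crude plug-in estimator; the paper's approach
    uses Parzen-window density estimates instead)."""
    # Discretize the feature into `bins` equal-width bins.
    edges = np.histogram_bin_edges(x, bins=bins)
    x_binned = np.digitize(x, edges[1:-1])  # values in 0..bins-1
    classes = {c: i for i, c in enumerate(np.unique(y))}
    # Build the joint distribution p(x_bin, y_class).
    joint = np.zeros((bins, len(classes)))
    for xb, yv in zip(x_binned, y):
        joint[xb, classes[yv]] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)   # marginal over feature bins
    py = joint.sum(axis=0, keepdims=True)   # marginal over classes
    nz = joint > 0
    # I(X;Y) = sum p(x,y) log( p(x,y) / (p(x) p(y)) )
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def rank_features(X, y, bins=8):
    """Rank the columns of X by estimated MI with y (filter-style ranking).
    Returns the feature indices, most relevant first, and the raw scores."""
    scores = [mutual_information(X[:, j], y, bins) for j in range(X.shape[1])]
    order = sorted(range(X.shape[1]), key=lambda j: -scores[j])
    return order, scores
```

In the brief's scheme, one would walk down this ranking and keep adding input units while they improve the network's performance, discarding the remainder as irrelevant.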