Effective modeling of high-dimensional data requires both feature selection and fast learning speed. To address this problem, a novel modeling approach based on mutual information and extreme learning machines is proposed in this paper. A simple mutual-information-based feature selection method is integrated with fast kernel-based extreme learning machines to obtain better modeling performance. In this method, the optimal number of features and the learning parameters of the models are selected simultaneously. Simulation results on near-infrared spectra show that the proposed approach achieves better prediction performance and fast learning speed.
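The combination described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes mutual-information feature ranking (here via scikit-learn's `mutual_info_regression`), a standard kernel ELM with an RBF kernel and solution `beta = (I/C + K)^{-1} y`, and a joint grid search over the number of top-ranked features and the regularization parameter `C`. The synthetic data stands in for near-infrared spectra.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
# Synthetic stand-in for near-infrared spectra: 100 samples, 50 features,
# with the target depending only on the first 5 features.
X = rng.standard_normal((100, 50))
y = X[:, :5].sum(axis=1) + 0.1 * rng.standard_normal(100)

# Step 1: rank features by mutual information with the target.
mi = mutual_info_regression(X, y, random_state=0)
order = np.argsort(mi)[::-1]

def kernel_elm_fit(X_tr, y_tr, C=10.0, gamma=0.1):
    """Kernel ELM output weights: beta = (I/C + K)^{-1} y."""
    K = rbf_kernel(X_tr, X_tr, gamma=gamma)
    return np.linalg.solve(np.eye(len(y_tr)) / C + K, y_tr)

def kernel_elm_predict(X_tr, beta, X_new, gamma=0.1):
    """Predict with the kernel between new and training samples."""
    return rbf_kernel(X_new, X_tr, gamma=gamma) @ beta

# Step 2: jointly select the number of features and the parameter C
# by validation error on a held-out split (hypothetical grid values).
X_tr, X_va, y_tr, y_va = X[:80], X[80:], y[:80], y[80:]
best = None
for k in (5, 10, 20, 50):
    for C in (1.0, 10.0, 100.0):
        cols = order[:k]
        beta = kernel_elm_fit(X_tr[:, cols], y_tr, C=C)
        pred = kernel_elm_predict(X_tr[:, cols], beta, X_va[:, cols])
        rmse = float(np.sqrt(np.mean((pred - y_va) ** 2)))
        if best is None or rmse < best[0]:
            best = (rmse, k, C)

print(best)  # (validation RMSE, selected feature count, selected C)
```

Because the only training cost is one kernel matrix and one linear solve per candidate setting, the joint search over feature count and learning parameters remains fast, which is the practical point of pairing mutual information with a kernel ELM.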