Extreme Learning Machine (ELM), a competitive machine learning technique for single-hidden-layer feedforward neural networks (SLFNNs), is simple in theory and fast in implementation. To handle high-dimensional data with noise, this paper proposes an ELM with a hierarchical structure (HELM). The proposed HELM consists of two parts: several groups of subnets and a main net. The subnets are built from well-trained auto-associative neural networks (AANNs), which reduce dimensionality and filter out noise; the main net is a traditional ELM. Additionally, from the perspective of data attribute spaces (DASs), the difficulty of designing the subnets is avoided by a method of Data Attributes Extension Classification (DAEC). Experiments on five high-dimensional noisy datasets are carried out to evaluate the HELM model. The results show that HELM achieves higher accuracy than ELM while using fewer neurons in the main net.
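To make the core idea concrete, here is a minimal sketch of a standard single-hidden-layer ELM (the building block the abstract refers to), not the paper's HELM itself. All function and variable names are ours for illustration. The defining property is that the input weights and biases are drawn at random and never trained; only the output weights are computed, in closed form, via a least-squares pseudoinverse.

```python
import numpy as np

def elm_fit(X, Y, n_hidden=50, rng=None):
    """Train an ELM: random fixed hidden layer, least-squares output layer.

    X: (n_samples, n_features), Y: (n_samples, n_outputs).
    Returns the random hidden parameters and the solved output weights.
    """
    rng = np.random.default_rng(rng)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (fixed)
    b = rng.standard_normal(n_hidden)                # random biases (fixed)
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                     # output weights, closed form
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass: hidden activations times the solved output weights."""
    return np.tanh(X @ W + b) @ beta
```

Because training reduces to one matrix pseudoinverse rather than iterative gradient descent, fitting is very fast, which is the "fast in implementation" property the abstract highlights; the hierarchical variant in the paper adds AANN-based subnets in front of such a main net.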