A new structure of wavelet neural networks (WNN) with extreme learning machine (ELM) is introduced in this paper. In the proposed wavelet neural networks, composite functions are applied at the hidden nodes and learning is performed with ELM. The input is first processed by wavelet functions and then passed through a bounded, nonconstant, piecewise continuous activation function g: R → R. The translation and dilation parameters are initialized by a selection method that keeps the wavelets nonzero over the domain of the input space. The resulting wavelet neural network is then trained with the computationally efficient ELM algorithm. Experimental results on the regression of nonlinear functions and real-world data, the prediction of a chaotic signal, and classification on several benchmark real-world data sets show that the proposed networks achieve better performance in most cases than several related neural networks, and learn much faster than networks trained with the traditional back-propagation (BP) algorithm.