Non uniform noisy data training using wavelet neural network based on sampling theory
WSEAS Transactions on Systems
To avoid overfitting, sampling theory is applied to the training of wavelet networks. A new algorithm is proposed that exploits the limited frequency band of wavelet networks: the input-layer weights are determined by the sampling period or the frequency band of the target function rather than by the sample errors. Wavelet networks trained with this algorithm converge globally, avoid local minima, and can approximate band-limited functions. The algorithm is also extended to learning from noisy data. Theorems show that a wavelet network trained in this way acts as an ideal low-pass filter, removing the high-frequency noise in the training data. In simulations, the performance of the new algorithm is compared with that of regularization techniques; the results show that it is more robust to the noise variance and removes high-frequency noise more effectively.
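The key idea above can be sketched in a few lines. The following is a minimal illustration, not the authors' exact construction: hidden units are sinc atoms placed on a grid whose spacing is fixed by the assumed frequency band (Shannon sampling), so the input-layer parameters come from the band, and only the linear output weights are fitted by least squares. The `band` parameter and the sinc activation are assumptions for illustration.

```python
import numpy as np

def sinc_network_fit(x, y, band):
    """Fit only the output weights of a network whose hidden units are
    sinc atoms on a grid set by the assumed frequency band ``band``
    (grid spacing T = 1 / (2 * band), the Nyquist period).  The
    input-layer parameters (centers, dilation) are fixed by the band,
    not by the sample errors."""
    T = 1.0 / (2.0 * band)                       # sampling period from the band
    centers = np.arange(x.min(), x.max() + T, T)  # fixed hidden-unit centers
    # design matrix: one sinc atom per grid point
    Phi = np.sinc((x[:, None] - centers[None, :]) / T)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # linear output weights only
    return centers, T, w

def sinc_network_eval(x, centers, T, w):
    Phi = np.sinc((x[:, None] - centers[None, :]) / T)
    return Phi @ w

# usage: recover a band-limited target from noisy samples
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
clean = np.sin(2 * np.pi * 2 * x)                 # 2 Hz target, band-limited
noisy = clean + 0.3 * rng.standard_normal(x.size) # additive broadband noise
c, T, w = sinc_network_fit(x, noisy, band=4.0)    # band chosen >= signal band
denoised = sinc_network_eval(x, c, T, w)
```

Because the grid admits only frequencies up to `band`, the least-squares fit projects the noisy samples onto a low-frequency subspace, so the network behaves like a low-pass filter: the high-frequency noise component cannot be represented and is discarded.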