We propose two evolutionary training algorithms for Beta basis function neural networks (BBFNNs). Classic neural network training algorithms start from a predetermined network structure, so the quality of a BBFNN's response depends strongly on that structure; the network obtained by training a fixed architecture is generally either insufficient or overcomplicated. This paper describes two genetic learning models for the BBFNN. In the first, continuous, genetic model, each network is coded as a variable-length string and the number of neurons in the hidden layer is changed through the application of specific genetic operators; new genetic operators are proposed to evolve a population of individuals, together with a function to evaluate the fitness of each network. Function approximation problems are considered to demonstrate the performance of the BBFNN and of the evolutionary algorithm. In the second, discrete, genetic model, each network is coded as a matrix whose number of rows equals the number of parameters of the function to be approximated, and the genetic operators again change the number of neurons in the hidden layer. Applications to functions of one and two parameters demonstrate the performance of this model and the suitability of the genetic algorithm for designing BBFNNs.
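The abstract does not spell out the Beta basis function, the string encoding, or the operators, so the following is only a minimal, hypothetical sketch of the first (variable-length string) model. It assumes Alimi's one-dimensional Beta function, a genome that is a list of hidden neurons, a fitness of 1/(1 + MSE), one-point crossover on variable-length strings, and grow/prune/perturb mutations; all names and parameter ranges here are illustrative, not the authors' actual algorithm.

```python
import math
import random

def beta(x, x0, x1, p, q):
    """One-dimensional Beta basis function on (x0, x1) with shape p, q > 0
    (Alimi's form); it peaks at xc and is zero outside the support."""
    if not (x0 < x < x1):
        return 0.0
    xc = (p * x1 + q * x0) / (p + q)          # centre, strictly inside (x0, x1)
    return ((x - x0) / (xc - x0)) ** p * ((x1 - x) / (x1 - xc)) ** q

def net_output(genome, x):
    # A genome is a variable-length list of hidden neurons (w, x0, x1, p, q);
    # its length IS the hidden-layer size, so structure evolves with the string.
    return sum(w * beta(x, x0, x1, p, q) for (w, x0, x1, p, q) in genome)

def fitness(genome, samples):
    mse = sum((net_output(genome, x) - y) ** 2 for x, y in samples) / len(samples)
    return 1.0 / (1.0 + mse)                  # higher is better, in (0, 1]

def random_neuron():
    return (random.uniform(-1.0, 1.0),        # weight
            random.uniform(-1.5, 0.0),        # x0 (left edge of support)
            random.uniform(0.5, 2.0),         # x1 (right edge of support)
            random.uniform(1.0, 3.0),         # p
            random.uniform(1.0, 3.0))         # q

def crossover(a, b):
    # One-point crossover on variable-length strings: the cut points are chosen
    # independently, so the child's hidden-layer size may differ from both parents.
    i = random.randint(1, len(a))
    j = random.randint(1, len(b))
    return a[:i] + b[j:]

def mutate(genome):
    g = list(genome)
    r = random.random()
    if r < 0.2 and len(g) > 1:
        g.pop(random.randrange(len(g)))       # prune a hidden neuron
    elif r < 0.4:
        g.append(random_neuron())             # grow a hidden neuron
    else:
        k = random.randrange(len(g))          # perturb one neuron's weight
        w, x0, x1, p, q = g[k]
        g[k] = (w + random.gauss(0.0, 0.1), x0, x1, p, q)
    return g

def evolve(samples, pop_size=30, generations=40):
    pop = [[random_neuron() for _ in range(random.randint(2, 6))]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(g, samples), reverse=True)
        elite = pop[: pop_size // 3]          # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            children.append(mutate(crossover(a, b)))
        pop = elite + children
    return max(pop, key=lambda g: fitness(g, samples))
```

A short run on a one-parameter target, e.g. `evolve([(x/20, math.sin(math.pi*x/20)) for x in range(1, 20)])`, returns the fittest variable-length genome found; because crossover and mutation both change string length, the hidden-layer size is selected along with the weights, which is the point of the first model.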