This paper aims to find an optimal set of initial weights that enhances the accuracy of artificial neural networks (ANNs) by using a genetic algorithm (GA). The sample in this study comprised 228 patients with a first low-trauma hip fracture and 215 patients without hip fracture, all of whom were interviewed with 78 questions. We used logistic regression to select five important factors (bone mineral density, experience of fracture, average hand grip strength, intake of coffee, and peak expiratory flow rate) for building ANNs to predict the probability of hip fracture. Three-layer (one hidden layer) ANN models trained with the back-propagation algorithm were adopted, and the GA was used to search for the optimal initial weights to improve predictability. The area under the ROC curve (AUC) was used to assess the performance of the neural networks. The GA-initialized networks obtained an AUC of 0.858 ± 0.00493 on modeling data and 0.802 ± 0.03318 on testing data; the testing result was slightly better than that of our previous study (0.868 ± 0.00387 on modeling data and 0.796 ± 0.02559 on testing data, resp.). Thus, this preliminary study shows that even a simple GA can be effective for improving the accuracy of artificial neural networks.
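The approach described above can be sketched in code. The abstract does not give the paper's actual GA configuration (population size, crossover and mutation operators, number of generations), so the following is a minimal illustrative sketch, assuming a real-coded GA with uniform crossover, Gaussian mutation, and elitism, where each chromosome is a flat vector of initial weights for a one-hidden-layer network and fitness is the training AUC. All parameter values and helper names (`forward`, `auc`, `ga_init_weights`) are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, X, n_hidden):
    """Score inputs with a 1-hidden-layer net whose weights are unpacked from flat vector w."""
    n_in = X.shape[1]
    i = 0
    W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = w[i:i + n_hidden]; i += n_hidden
    W2 = w[i:i + n_hidden]; i += n_hidden
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)                      # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output = predicted probability

def auc(y, scores):
    """Area under the ROC curve via the Mann-Whitney rank statistic."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = y.sum()
    n_neg = len(y) - n_pos
    return (ranks[y == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def ga_init_weights(X, y, n_hidden=3, pop=30, gens=40, sigma=0.3):
    """Evolve a population of flat weight vectors; fitness is the AUC on (X, y)."""
    dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1
    popw = rng.normal(0.0, 1.0, (pop, dim))
    for _ in range(gens):
        fit = np.array([auc(y, forward(w, X, n_hidden)) for w in popw])
        elite = popw[np.argsort(fit)[::-1][:pop // 2]]      # keep the better half
        pa = elite[rng.integers(0, len(elite), pop)]        # parent A for each child
        pb = elite[rng.integers(0, len(elite), pop)]        # parent B for each child
        mask = rng.random((pop, dim)) < 0.5                 # uniform crossover
        popw = np.where(mask, pa, pb) + rng.normal(0.0, sigma, (pop, dim))  # mutation
        popw[0] = elite[0]                                  # elitism: best survives intact
    fit = np.array([auc(y, forward(w, X, n_hidden)) for w in popw])
    return popw[np.argmax(fit)], fit.max()
```

In the paper's setting the evolved vector would then serve as the starting point for back-propagation training on the five selected predictors, rather than as the final weights.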