Analysis of a variable speed vapor compression system using artificial neural networks
Expert Systems with Applications: An International Journal
Artificial neural networks are used to solve problems that are difficult for both humans and conventional computers. Unfortunately, training an artificial neural network is time consuming and, because training is a stochastic process, several cold starts are recommended. Neural network training is typically a two-step process. First, the network's weights are initialized with a non-greedy method so that local minima can be escaped. Second, an optimization method (e.g., conjugate gradient learning) is used to quickly find the nearest local minimum. In general, training seeks to reduce the mean square error between the desired output and the actual network output. One common initialization method is simulated annealing, which assigns good starting values to the network's weights before the optimization step. The performance of simulated annealing depends strongly on the cooling schedule. A cooling schedule based on temperature cycling is proposed to improve artificial neural network training. It is shown that temperature cycling reduces training time while decreasing the mean square error on auto-associative neural networks. Three auto-associative problems, the Trifolium, the Cardioid, and the Lemniscate of Bernoulli, are solved using exponential cooling, linear cooling, and temperature cycling to verify these results.
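The abstract does not give the authors' algorithm in detail, but the idea of simulated annealing with a temperature-cycling schedule can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the "network" is a per-component linear autoassociative map (a stand-in for a real forward pass), and all function names, parameters, and schedule constants (`cycles`, `steps`, `t_hi`, `t_lo`) are assumptions. Each cycle reheats the temperature and then cools it exponentially, so the search can repeatedly escape local minima before settling.

```python
# Hedged sketch: simulated annealing with temperature cycling for
# initializing the weights of a toy autoassociative model.
import math
import random

random.seed(0)

def mse(weights, data):
    # Toy autoassociative "network": y_i = w_i * x_i, so the ideal
    # weights reproduce the input exactly (w_i = 1). A real network
    # forward pass would go here instead.
    err = 0.0
    for x in data:
        y = [w * xi for w, xi in zip(weights, x)]
        err += sum((yi - xi) ** 2 for yi, xi in zip(y, x)) / len(x)
    return err / len(data)

def anneal_with_cycling(data, n_weights, cycles=3, steps=200,
                        t_hi=1.0, t_lo=0.01):
    # Random cold start for the weights.
    w = [random.uniform(-1.0, 1.0) for _ in range(n_weights)]
    best_w, best_e = list(w), mse(w, data)
    for _ in range(cycles):                 # each cycle: reheat, then cool
        t = t_hi
        decay = (t_lo / t_hi) ** (1.0 / steps)
        for _ in range(steps):
            # Perturbation scale shrinks with the temperature.
            cand = [wi + random.gauss(0.0, t) for wi in w]
            e_old, e_new = mse(w, data), mse(cand, data)
            # Metropolis acceptance: always take improvements, and
            # sometimes accept uphill moves to escape local minima.
            if e_new < e_old or random.random() < math.exp((e_old - e_new) / t):
                w = cand
                if e_new < best_e:
                    best_w, best_e = list(w), e_new
            t *= decay                      # exponential cooling within a cycle
    return best_w, best_e
```

The best weights found would then be handed to a fast local optimizer (such as conjugate gradient) for the second training stage described in the abstract; a linear-cooling variant would simply subtract a fixed amount from `t` each step instead of multiplying by `decay`.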