This paper proposes a novel unsupervised training method, based on a direct search optimization technique, which can be employed to find the optimal free parameters, i.e., the weights and biases, of an artificial neural network (ANN). Benchmark data sets from artificial and real-world problems were used in experiments that enable a comparison with other optimization methods, e.g. genetic algorithms, and with state-of-the-art classifiers. The results provide evidence of the effectiveness of our method in finding optimal values of the weights and biases of a multilayer perceptron and in constructing an ANN autonomously.
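To illustrate the general idea of derivative-free training, the following is a minimal sketch, not the paper's exact algorithm: a tiny multilayer perceptron whose weights and biases are optimized by a coordinate-wise pattern (compass) search instead of gradient descent. The network shape, step-size schedule, and XOR task are illustrative assumptions.

```python
import numpy as np

def mlp_forward(w, X):
    """2-2-1 MLP with a tanh hidden layer; w packs all weights and biases."""
    W1 = w[0:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8:9]
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def loss(w, X, y):
    """Mean squared error of the network on the data set."""
    return float(np.mean((mlp_forward(w, X) - y) ** 2))

def pattern_search(f, w0, step=0.5, shrink=0.5, tol=1e-4, max_iter=2000):
    """Compass search: poll +/- step along each coordinate, accept the
    first improving move; if no poll point improves, shrink the step."""
    w = w0.copy()
    fw = f(w)
    for _ in range(max_iter):
        improved = False
        for i in range(len(w)):
            for d in (step, -step):
                trial = w.copy()
                trial[i] += d
                ft = f(trial)
                if ft < fw:
                    w, fw, improved = trial, ft, True
                    break
        if not improved:
            step *= shrink          # mesh refinement
            if step < tol:
                break
    return w, fw

# Toy XOR data set (an assumption for illustration only).
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
w0 = rng.normal(scale=0.5, size=9)

w, final_loss = pattern_search(lambda w: loss(w, X, y), w0)
print(final_loss < loss(w0, X, y))  # the search should reduce the loss
```

The key property of such methods is that they require only function evaluations, no gradients, which is what allows them to tune weights, biases, and potentially network structure within a single search framework.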