Genetic algorithms are optimization techniques that are especially useful for functions whose nonlinearity makes analytical optimization impossible. Such functions arise when least squares estimators are used in nonlinear regression problems. Least squares optimizers in general, and the Levenberg-Marquardt method in particular, are iterative methods designed to solve problems of this kind, but their results depend both on the features of the problem and on how close the starting point is to the optimum. In this paper we study the least squares estimator and the optimization methods based on it. We then analyze the features of real-coded genetic algorithms that can be useful in the context of nonlinear regression. Special attention is devoted to the crossover operator, and a new operator based on confidence intervals is proposed. This crossover strikes a balance between exploration and exploitation of the search space, which is well suited to this kind of problem. To assess the fitness and robustness of the proposed crossover operator, we use three complex nonlinear regression problems with search domains of different amplitudes and compare its performance with that of other crossover operators and with the Levenberg-Marquardt method under a multi-start scheme.
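To make the idea of a confidence-interval based crossover concrete, the following is a minimal sketch of one plausible form of such an operator: for each gene, a confidence interval (mean ± t·s/√n) is built from the n fittest individuals, and the offspring gene is drawn from that interval. The function name, parameters, and the uniform sampling rule are illustrative assumptions, not the exact operator proposed in the paper.

```python
import numpy as np

def ci_crossover(population, fitness, n_best=5, t_value=2.776, rng=None):
    """Sketch of a confidence-interval based crossover for a real-coded GA.

    For each gene, a confidence interval mean +/- t * s / sqrt(n) is
    estimated from the n_best fittest individuals (lower fitness = better),
    and the offspring gene is sampled uniformly from that interval.
    Illustrative only: the operator proposed in the paper may differ.
    """
    rng = np.random.default_rng() if rng is None else rng
    best = population[np.argsort(fitness)[:n_best]]       # n_best fittest rows
    mean = best.mean(axis=0)                              # per-gene sample mean
    half = t_value * best.std(axis=0, ddof=1) / np.sqrt(n_best)  # CI half-width
    return rng.uniform(mean - half, mean + half)          # offspring vector

# Usage: minimize a sphere function, so fitness is the squared norm.
rng = np.random.default_rng(0)
population = rng.normal(size=(20, 3))
fitness = (population ** 2).sum(axis=1)
child = ci_crossover(population, fitness, rng=np.random.default_rng(1))
```

A narrow interval (fit individuals clustered together) makes the operator exploitative, while a wide interval keeps it explorative, which is one way to read the exploration/exploitation balance claimed in the abstract.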