In symbolic regression, it is difficult for evolutionary algorithms to construct a regression model when the number of sample points is very large: much time is spent calculating the fitness of individuals and selecting the best individuals within the population. The Hoeffding bound is a probability bound on sums of independent random variables. As a statistical result, it can be used to decide exactly how many samples are necessary to choose the i best individuals from a population in an evolutionary algorithm without computing the fitness completely. This paper presents a Hoeffding-bound-based evolutionary algorithm (HEA) for regression or approximation problems in which the number of given learning samples is very large. In HEA, the original fitness function is used every k generations to update the approximate fitness obtained via the Hoeffding bound. The parameter 1-δ is the probability of correctly selecting the i best individuals from population P; it can be tuned to avoid an unstable evolutionary process caused by a large discrepancy between the approximate model and the original fitness function. The major advantage of the proposed HEA is that, with a determinable probability, it guarantees the discovered solution matches the performance of one found with a traditional genetic programming (GP) selection operator, while greatly reducing running time. We examine the performance of the proposed algorithm on several regression problems; the results indicate that, at similar accuracy, HEA finds solutions more efficiently than a traditional EA. It is particularly useful for regression problems with a large number of training samples.
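The paper's exact selection procedure is not reproduced here, but the core idea — evaluating individuals on incrementally more samples until the Hoeffding bound ε = sqrt(R² ln(1/δ) / (2n)) certifies the top-i set with probability at least 1-δ — can be sketched as follows. All names (`racing_select`, the callable-individual interface, the batch and cap parameters) are hypothetical illustrations, not the authors' implementation:

```python
import math
import random

def hoeffding_epsilon(R, n, delta):
    # Hoeffding bound: after n independent samples of a variable with
    # range R, the true mean lies within epsilon of the empirical mean
    # with probability at least 1 - delta.
    return math.sqrt(R * R * math.log(1.0 / delta) / (2.0 * n))

def racing_select(population, sample_stream, i, R, delta,
                  batch=50, max_n=5000):
    """Select the i lowest-error individuals while evaluating only as
    many samples as the Hoeffding bound requires (a simplified sketch;
    a full treatment would also split delta across comparisons).
    Each individual is a callable mapping a sample to an error in [0, R];
    `sample_stream` yields training samples (hypothetical interface)."""
    sums = [0.0] * len(population)
    n = 0
    ranked = list(range(len(population)))
    while n < max_n:
        # Evaluate every individual on one more batch of samples.
        for _ in range(batch):
            x = next(sample_stream)
            for j, ind in enumerate(population):
                sums[j] += ind(x)
        n += batch
        means = [s / n for s in sums]
        eps = hoeffding_epsilon(R, n, delta)
        ranked = sorted(range(len(population)), key=lambda j: means[j])
        # Stop once the empirical gap between the i-th and (i+1)-th
        # candidates exceeds 2*epsilon: the top-i set is then correct
        # with probability at least 1 - delta.
        if means[ranked[i]] - means[ranked[i - 1]] > 2 * eps:
            break
    return [population[j] for j in ranked[:i]]
```

With widely separated error rates the race terminates after a small number of batches, which is the source of HEA's speedup on large sample sets: selection cost scales with the samples needed to distinguish individuals, not with the full training set.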