We present a statistical model of empirical optimization that admits the creation of algorithms with explicit, intuitively defined desiderata. Because No Free Lunch theorems dictate that no optimization algorithm can outperform any other when averaged over all possible functions, the desired function class plays a prominent role in the model. In particular, the model provides a direct way to answer the traditionally difficult question of which algorithm is best matched to a particular class of functions. Among the benefits of the model are the ability to specify the function class in a straightforward manner, a natural way to specify noisy or dynamic functions, and a new source of insight into No Free Lunch theorems for optimization.
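The No Free Lunch claim invoked above can be checked directly on a toy search space. The sketch below is illustrative and not from the paper: it enumerates every function on a three-point domain and confirms that two different deterministic, non-repeating query orders achieve the same average best-found value after any number of queries.

```python
from itertools import product

def avg_best_after_k(order, k, domain_size=3, values=range(3)):
    """Average best value found after the first k queries of `order`,
    averaged over ALL functions f: {0..domain_size-1} -> values
    (the No Free Lunch setting)."""
    total = count = 0
    for f in product(values, repeat=domain_size):  # every possible function
        total += max(f[x] for x in order[:k])
        count += 1
    return total / count

# Two deterministic search algorithms = two fixed query orders.
order_a = [0, 1, 2]
order_b = [2, 0, 1]
for k in (1, 2, 3):
    assert avg_best_after_k(order_a, k) == avg_best_after_k(order_b, k)
```

Restricting the average to a particular function class, rather than all 27 functions, is exactly where the two orders can start to differ, which is the point the abstract makes about the prominence of the function class.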