Evolutionary optimization of catalysts assisted by neural-network learning
SEAL'10: Proceedings of the 8th International Conference on Simulated Evolution and Learning
The paper deals with a neural-network-based variant of surrogate modelling, a modern approach to the optimization of empirical objective functions. Surrogate modelling substantially decreases the time and cost of evaluating the objective function, a property that is particularly attractive in evolutionary optimization, where many candidate solutions must be assessed. The paper proposes extending surrogate modelling with regression boosting, which increases the accuracy of the surrogate models and thus the agreement between results obtained with the model and those obtained with the original objective function. The extension is illustrated on a case study in materials science, whose results clearly confirm the usefulness of boosting for neural-network-based surrogate models.
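The overall scheme the abstract describes can be sketched in a few lines: a boosted regression surrogate is trained on all solutions evaluated so far, and each generation of the evolutionary algorithm uses it to pre-screen a large pool of offspring so that only the most promising ones are passed to the expensive true objective. The sketch below is a schematic stand-in, not the paper's method: it uses regression stumps with least-squares boosting instead of the paper's boosted neural networks, a simple sphere function in place of the empirical objective, and hypothetical names (`BoostedSurrogate`, `surrogate_assisted_ea`) chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)


def true_objective(x):
    # Stand-in for the expensive empirical objective (here: sphere function).
    return np.sum(x ** 2, axis=-1)


class Stump:
    """Weak learner: a one-split regression tree."""

    def fit(self, X, y):
        best = (np.inf, 0, 0.0, y.mean(), y.mean())
        for j in range(X.shape[1]):
            for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
                left = X[:, j] <= t
                if left.all() or not left.any():
                    continue
                lv, rv = y[left].mean(), y[~left].mean()
                err = np.sum((y - np.where(left, lv, rv)) ** 2)
                if err < best[0]:
                    best = (err, j, t, lv, rv)
        _, self.j, self.t, self.lv, self.rv = best
        return self

    def predict(self, X):
        return np.where(X[:, self.j] <= self.t, self.lv, self.rv)


class BoostedSurrogate:
    """Least-squares boosting: each stump fits the current residuals."""

    def __init__(self, n_rounds=50, lr=0.5):
        self.n_rounds, self.lr = n_rounds, lr

    def fit(self, X, y):
        self.base = y.mean()
        resid = y - self.base
        self.stumps = []
        for _ in range(self.n_rounds):
            s = Stump().fit(X, resid)
            resid = resid - self.lr * s.predict(X)
            self.stumps.append(s)
        return self

    def predict(self, X):
        out = np.full(len(X), self.base)
        for s in self.stumps:
            out += self.lr * s.predict(X)
        return out


def surrogate_assisted_ea(dim=5, pop=20, gens=15):
    # Initial population, evaluated with the true objective.
    X = rng.uniform(-5, 5, (pop, dim))
    y = true_objective(X)
    archive_X, archive_y = X.copy(), y.copy()
    for _ in range(gens):
        # Retrain the boosted surrogate on all true evaluations so far.
        surrogate = BoostedSurrogate().fit(archive_X, archive_y)
        # Generate a large offspring pool by Gaussian mutation.
        offspring = X[rng.integers(pop, size=4 * pop)] + rng.normal(0, 0.5, (4 * pop, dim))
        # Pre-screen with the cheap surrogate; only the best `pop` offspring
        # are evaluated with the expensive true objective.
        keep = np.argsort(surrogate.predict(offspring))[:pop]
        cand = offspring[keep]
        cy = true_objective(cand)
        archive_X = np.vstack([archive_X, cand])
        archive_y = np.concatenate([archive_y, cy])
        # Elitist survivor selection over parents and evaluated offspring.
        both_X = np.vstack([X, cand])
        both_y = np.concatenate([y, cy])
        elite = np.argsort(both_y)[:pop]
        X, y = both_X[elite], both_y[elite]
    return y.min()
```

The key design point matches the abstract: the more accurate the boosted surrogate, the better its ranking of offspring agrees with the ranking under the true objective, so fewer expensive evaluations are wasted on poor candidates.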