In many real-world problems, optimization decisions must often be made with limited information. The decision maker may have no a priori data about the (nonconvex) objective function except for a limited number of points obtained over time through costly observations. This paper presents an optimization framework that addresses the information-collection (observation), estimation (regression), and optimization (maximization) aspects in a holistic and structured manner. The information acquired at each optimization step is explicitly quantified using the entropy measure from information theory, while the objective function is modeled and estimated within a Bayesian framework, specifically using Gaussian processes as a state-of-the-art regression method. The resulting iterative scheme allows the decision maker to solve the problem while expressing preferences for each aspect quantitatively and concurrently.
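The iterative scheme described above can be sketched as follows. This is a minimal illustration under assumed details, not the authors' exact algorithm: a Gaussian process with a squared-exponential kernel models the unknown objective from a few costly observations, the differential entropy of the GP predictive distribution quantifies the information gained at a candidate point, and the next observation maximizes a weighted sum of the estimated objective value and that entropy. The test objective, kernel hyperparameters, and preference weight `w` are all hypothetical.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=0.3, signal_var=1.0):
    """Squared-exponential covariance between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise_var=1e-4):
    """GP predictive mean and variance at x_test, given noisy observations."""
    K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss) - np.sum(v ** 2, axis=0)
    return mean, np.maximum(var, 1e-12)

def objective(x):
    """Hypothetical nonconvex black-box objective, unknown to the optimizer."""
    return np.sin(3 * x) + 0.5 * np.cos(7 * x)

rng = np.random.default_rng(0)
x_obs = rng.uniform(0.0, 1.0, size=3)   # a few costly initial observations
y_obs = objective(x_obs)
grid = np.linspace(0.0, 1.0, 200)       # candidate observation points

for _ in range(10):                     # iterative observe-estimate-optimize loop
    mean, var = gp_posterior(x_obs, y_obs, grid)
    # Differential entropy of the Gaussian predictive distribution at each point
    entropy = 0.5 * np.log(2 * np.pi * np.e * var)
    w = 0.5                             # preference weight (assumed), trades
    acq = mean + w * entropy            # estimated value against information gain
    x_next = grid[np.argmax(acq)]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next))

best = x_obs[np.argmax(y_obs)]          # best observed decision so far
```

Raising or lowering `w` expresses the decision maker's preference between exploiting the current estimate of the objective and collecting more information, which is the quantitative trade-off the framework formalizes.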