Parameter space exploration with Gaussian process trees
ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning
The Journal of Machine Learning Research
A framework for optimization under limited information
Proceedings of the 5th International ICST Conference on Performance Evaluation Methodologies and Tools
Model guided adaptive design and analysis in computer experiment
Statistical Analysis and Data Mining
A framework for optimization under limited information
Journal of Global Optimization
Active learning of intuitive control knobs for synthesizers using Gaussian processes
Proceedings of the 19th International Conference on Intelligent User Interfaces
We consider active data selection and test point rejection strategies for Gaussian process regression based on the variance of the posterior over target values. Gaussian process regression is viewed as transductive regression that provides target distributions for given points rather than selecting an explicit regression function. Since not only the posterior mean but also the posterior variance is easily calculated, we use this additional information to two ends. Active data selection is performed by querying either at points of high estimated posterior variance or at points that minimize the estimated posterior variance averaged over the input distribution of interest or, in a transductive manner, averaged over the test set. Test point rejection is performed using the estimated posterior variance as a confidence measure. On both a two-dimensional toy problem and a real-world benchmark problem, we find that the posterior variance is a reasonable criterion for both active data selection and test point rejection.
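The abstract's two uses of the posterior variance can be sketched in a short NumPy example. This is a minimal illustration, not the authors' implementation: the squared-exponential kernel, the noise level, the candidate pool, and the rejection threshold are all assumptions chosen for the sketch. It shows the standard GP posterior computation, greedy query selection at the point of highest posterior variance, and test point rejection where the variance exceeds a confidence threshold.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_posterior(X_train, y_train, X_query, noise=1e-2):
    # Standard GP regression posterior: mean and pointwise variance
    # at the query points, via a Cholesky factorization of the
    # (noise-regularized) training kernel matrix.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_query)
    K_ss_diag = np.diag(rbf_kernel(X_query, X_query))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = K_ss_diag - (v ** 2).sum(axis=0)
    return mean, var

# Toy two-dimensional problem (an assumption for illustration).
rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(10, 2))
y_train = np.sin(X_train[:, 0]) * np.cos(X_train[:, 1])
X_pool = rng.uniform(-3, 3, size=(200, 2))

mean, var = gp_posterior(X_train, y_train, X_pool)

# Active data selection: query where the posterior variance is highest.
next_query = X_pool[np.argmax(var)]

# Test point rejection: keep only points whose posterior variance
# falls below a confidence threshold (0.5 is an arbitrary choice here).
accepted = X_pool[var < 0.5]
```

The variant that minimizes the variance averaged over the test set would instead score each candidate by the pool-averaged posterior variance after hypothetically adding it to the training set; the greedy maximum-variance rule above is the cheaper of the two strategies described.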