Multi-output local Gaussian process regression: Applications to uncertainty quantification
Journal of Computational Physics
Computer codes simulating physical systems usually produce responses consisting of several distinct outputs (e.g., velocity and pressure) that also evolve in space and time and depend on many uncertain input parameters (e.g., physical constants, initial/boundary conditions, etc.). Moreover, essential engineering tasks such as uncertainty quantification, inverse problems, and design are notoriously difficult to carry out, largely because only a limited number of simulations is available. The aim of this work is to introduce a fully Bayesian approach to these problems that accounts for the uncertainty induced by the finite number of observations. Our model is built on a multi-dimensional Gaussian process that explicitly treats correlations between distinct output variables as well as across space and/or time. The use of a separable covariance function enables us to express the very large covariance matrix as a Kronecker product of smaller matrices, leading to efficient algorithms for inference and prediction. The novelty of this work is the recognition that the Gaussian process model defines a posterior probability measure on the function space of possible surrogates for the computer code, together with the derivation of an algorithmic procedure that allows us to sample from it efficiently. We demonstrate how the scheme can be used in uncertainty quantification tasks to obtain error bars for the statistics of interest that account for the finite number of observations.
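The computational saving from a separable covariance rests on a standard Kronecker identity: a linear solve with K_s ⊗ K_t can be carried out using only the small factor matrices. The sketch below (a minimal illustration, not the authors' implementation; the RBF kernels, grid sizes, and jitter are assumed for the example) verifies the identity numerically in NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n_s spatial points and n_t time points with an
# assumed squared-exponential kernel on each; the full covariance is
# K = K_s kron K_t, as in a separable covariance function.
def rbf(x, ls):
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

n_s, n_t = 5, 4
xs = np.linspace(0.0, 1.0, n_s)
xt = np.linspace(0.0, 1.0, n_t)
K_s = rbf(xs, 0.3) + 1e-6 * np.eye(n_s)  # jitter for conditioning
K_t = rbf(xt, 0.3) + 1e-6 * np.eye(n_t)

y = rng.standard_normal(n_s * n_t)

# Naive solve: build the full (n_s*n_t) x (n_s*n_t) matrix explicitly.
x_direct = np.linalg.solve(np.kron(K_s, K_t), y)

# Structured solve: with row-major vec (NumPy's default reshape),
# kron(K_s, K_t) @ vec(X) = vec(K_s @ X @ K_t.T), so the inverse
# requires only solves with the small factors K_s and K_t.
Y = y.reshape(n_s, n_t)
X = np.linalg.solve(K_t, np.linalg.solve(K_s, Y).T).T
x_kron = X.ravel()

print(np.allclose(x_direct, x_kron))
```

The structured solve costs O(n_s^3 + n_t^3 + n_s n_t (n_s + n_t)) instead of O((n_s n_t)^3), which is what makes inference with the very large covariance matrix tractable.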