Journal of Computational Physics
The optimal selection of experimental conditions is essential to maximizing the value of data for inference and prediction, particularly in situations where experiments are time-consuming and expensive to conduct. We propose a general mathematical framework and an algorithmic approach for optimal experimental design with nonlinear simulation-based models; in particular, we focus on finding sets of experiments that provide the most information about targeted sets of parameters. Our framework employs a Bayesian statistical setting, which provides a foundation for inference from noisy, indirect, and incomplete data, and a natural mechanism for incorporating heterogeneous sources of information. An objective function is constructed from information theoretic measures, reflecting expected information gain from proposed combinations of experiments. Polynomial chaos approximations and a two-stage Monte Carlo sampling method are used to evaluate the expected information gain. Stochastic approximation algorithms are then used to make optimization feasible in computationally intensive and high-dimensional settings. These algorithms are demonstrated on model problems and on nonlinear parameter inference problems arising in detailed combustion kinetics.
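The two-stage Monte Carlo evaluation of expected information gain described above can be sketched as follows. This is an illustrative nested estimator, not the authors' implementation: the model `simulate`, the noise level `sigma`, the Gaussian additive-noise likelihood, and the sample sizes are all assumptions made for the example, and the polynomial chaos surrogate and stochastic approximation steps of the full framework are omitted.

```python
import numpy as np

def nested_mc_eig(design, simulate, prior_sample, sigma,
                  n_outer=500, n_inner=500, rng=None):
    """Two-stage (nested) Monte Carlo estimate of expected information gain,

        EIG(d) ~ (1/N) sum_i [ log p(y_i | theta_i, d)
                               - log (1/M) sum_j p(y_i | theta_j, d) ],

    assuming an additive Gaussian noise model y = G(theta, d) + eps.
    """
    rng = rng or np.random.default_rng(0)
    # Outer loop: draw parameters from the prior and simulate noisy data.
    thetas = prior_sample(n_outer, rng)
    y = simulate(thetas, design) + sigma * rng.standard_normal(n_outer)
    # Log-likelihood of each y_i under the theta_i that generated it.
    log_lik = (-0.5 * ((y - simulate(thetas, design)) / sigma) ** 2
               - np.log(sigma * np.sqrt(2.0 * np.pi)))
    # Inner loop: estimate the evidence p(y_i | d) with a fresh prior sample.
    inner = prior_sample(n_inner, rng)
    resid = (y[:, None] - simulate(inner[None, :], design)) / sigma
    log_p = -0.5 * resid ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))
    m = log_p.max(axis=1, keepdims=True)  # log-sum-exp for numerical stability
    log_evidence = m[:, 0] + np.log(np.mean(np.exp(log_p - m), axis=1))
    return float(np.mean(log_lik - log_evidence))
```

In the full framework this noisy estimate would be wrapped in a stochastic approximation loop (e.g. an SPSA-style update over `design`), since each evaluation of the objective is itself a Monte Carlo average.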