Research Interests
Design and analysis of computer experiments
- H. Dette, A. Pepelyshev (2010)
Generalized Latin hypercube design for computer experiments. Technometrics, in press.
Abstract
Space filling designs, which satisfy a uniformity property, are widely used in computer experiments. In the present paper, the performance of non-uniform experimental designs, which locate more points in a neighborhood of the boundary of the design space, is investigated. These designs are obtained by a quantile transformation of the one-dimensional projections of commonly used space filling designs. This transformation is motivated by logarithmic potential theory, which yields the arc-sine measure as an equilibrium distribution. The methodology is illustrated for maximin Latin hypercube designs by several examples. In particular, it is demonstrated that the new designs yield a smaller integrated mean squared error for prediction.
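The quantile transformation used in this paper can be illustrated with a short sketch. The Python snippet below is a minimal illustration, not the authors' construction: it starts from a plain random Latin hypercube design (rather than a maximin one) and pushes every one-dimensional projection toward the boundary with the arc-sine quantile function u -> sin^2(pi*u/2).

```python
import numpy as np

def random_lhd(n, d, seed=None):
    """Random Latin hypercube design with n points in [0, 1]^d."""
    rng = np.random.default_rng(seed)
    # one random permutation of the n cells per coordinate, jittered within each cell
    cells = np.array([rng.permutation(n) for _ in range(d)]).T
    return (cells + rng.uniform(size=(n, d))) / n

def arcsine_transform(design):
    """Quantile transform of each coordinate to the arc-sine distribution,
    which places more points near the boundary of [0, 1]^d."""
    return np.sin(0.5 * np.pi * design) ** 2

lhd = random_lhd(20, 2, seed=0)
boundary_lhd = arcsine_transform(lhd)   # denser near 0 and 1 in every projection
```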
- H. Dette, A. Pepelyshev (2010)
NPUA: A new approach for the analysis of computer experiments. Chemometrics and Intelligent Laboratory Systems 104, 333–340.
Abstract
An important problem in the analysis of computer experiments is the specification of the uncertainty of the prediction obtained from a meta-model. The Bayesian approach, developed for the uncertainty analysis of deterministic computer models, expresses uncertainty through a Gaussian process. There are several versions of the Bayesian approach, which differ in many regards, but all of them lead to time-consuming computations for large data sets. In the present paper we introduce a new approach in which the distribution of uncertainty is obtained in a general nonparametric form. The proposed approach, called nonparametric uncertainty analysis (NPUA), is computationally simple since it combines generic sampling and regression techniques. We compare NPUA with the Bayesian and Kriging approaches and show the advantages of NPUA for finding points for the next runs by reanalyzing the ASET model.
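The abstract does not spell out the steps of NPUA, so the sketch below is only a generic illustration of the idea of combining resampling with a regression meta-model to obtain a nonparametric distribution of a prediction; the quadratic meta-model, the test function, and all names are hypothetical and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(size=(40, 2))            # hypothetical computer-model inputs
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2   # hypothetical deterministic responses
x_new = np.array([0.5, 0.5])             # point at which uncertainty is wanted

def fit_predict(Xb, yb, x):
    # quadratic polynomial regression as a simple stand-in meta-model
    def feats(A):
        return np.column_stack([np.ones(len(A)), A, A ** 2, A[:, :1] * A[:, 1:]])
    coef, *_ = np.linalg.lstsq(feats(Xb), yb, rcond=None)
    return feats(x[None, :]) @ coef

preds = []
for _ in range(500):                     # resample the runs, refit, predict
    idx = rng.integers(0, len(X), len(X))
    preds.append(fit_predict(X[idx], y[idx], x_new)[0])

print(np.percentile(preds, [5, 50, 95]))  # nonparametric summary of prediction uncertainty
```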
- A. Pepelyshev (2010)
The role of the nugget term in the Gaussian process method. Accepted, MODA 9 - Advances in model-oriented design and analysis, Contrib. Statist.
Abstract
The maximum likelihood estimate of the correlation parameter of a Gaussian process with and without a nugget term is studied in the context of the analysis of deterministic models.
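As a rough illustration of this setting (assuming scikit-learn; this is not the paper's own study), one can fit a Gaussian process to noise-free runs of a deterministic function with and without a nugget term, here represented by a WhiteKernel, and compare the maximum likelihood estimates of the kernel parameters.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
X = rng.uniform(size=(30, 1))
y = np.sin(5 * X[:, 0])                  # deterministic "computer model" output

gp_plain = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
gp_nugget = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True).fit(X, y)

print(gp_plain.kernel_)    # ML correlation length without a nugget
print(gp_nugget.kernel_)   # ML correlation length and nugget variance
```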
- A. Pepelyshev (2009)
Improvement of random LHD for high dimensions. Proceedings of the 6th St. Petersburg Workshop on Simulation, 1091-1096.
Abstract
Designs of experiments for the multivariate case are reviewed. A fast algorithm for constructing good Latin hypercube designs is developed.
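The paper's algorithm is not reproduced here, but the following sketch shows the generic idea behind many such constructions: generate a random Latin hypercube design and improve it under the maximin distance criterion by coordinate swaps that preserve the Latin hypercube structure. All function names and settings are illustrative.

```python
import numpy as np
from scipy.spatial.distance import pdist

def centered_lhd(n, d, seed=None):
    """Random Latin hypercube design with cell-centred levels in [0, 1]^d."""
    rng = np.random.default_rng(seed)
    return (np.array([rng.permutation(n) for _ in range(d)]).T + 0.5) / n

def maximin_improve(D, n_iter=2000, seed=None):
    """Greedy improvement: keep a coordinate swap between two rows only if it
    increases the minimum inter-point distance."""
    rng = np.random.default_rng(seed)
    D = D.copy()
    best = pdist(D).min()
    n, d = D.shape
    for _ in range(n_iter):
        i, j = rng.choice(n, size=2, replace=False)
        k = rng.integers(d)
        D[i, k], D[j, k] = D[j, k], D[i, k]      # swapping a column entry keeps the LHD property
        new = pdist(D).min()
        if new > best:
            best = new
        else:
            D[i, k], D[j, k] = D[j, k], D[i, k]  # undo a non-improving swap
    return D

design = maximin_improve(centered_lhd(30, 5, seed=1), seed=2)
```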
- A. Pepelyshev (2010)
Fixed-domain asymptotics of the maximum likelihood estimator and the Gaussian process approach for deterministic models. In press.
Abstract
The fixed-domain asymptotics of the maximum likelihood estimator is studied in the framework of the Gaussian process approach for data collected as precise observations of a deterministic computer model given by an analytic function. It is shown that the maximum likelihood estimator of the correlation parameter of a Gaussian process does not converge to a finite value, and that the computational stability strongly depends on the type of correlation function. In particular, computations are most unstable for the Gaussian correlation function, which is typically used in the analysis of computer experiments, and significantly less unstable for the stable correlation function rho(t) = exp(-|t|^gamma), even for gamma = 1.9, which is close to 2.
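A small numerical check of the instability mentioned above (an illustration under assumed settings, not the paper's analysis): compare the condition numbers of correlation matrices built from the Gaussian correlation (gamma = 2) and the stable correlation with gamma = 1.9 on an equidistant grid; the length scale theta below is purely illustrative.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 25)
T = np.abs(x[:, None] - x[None, :])       # pairwise distances on an equidistant grid
theta = 0.4                               # illustrative length scale
for gamma in (2.0, 1.9):
    R = np.exp(-(T / theta) ** gamma)     # rho(t) = exp(-|t/theta|^gamma)
    print(f"gamma = {gamma}: condition number = {np.linalg.cond(R):.3e}")
```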