
In recent years, computer experiments have become popular in engineering and scientific applications. The rapidly increasing use of computer models poses great challenges in the design, modeling, and analysis of computer experiments. This thesis focuses on developing new methodologies that address some of these challenges. It consists of four chapters.

In Chapter 1, a new approach is taken to integrate data from approximate and detailed simulations to build a surrogate model that describes the relationship between output and input parameters. Experimental results from approximate simulations form the bulk of the data, and they are used to build a model based on a Gaussian process. The fitted model is then "adjusted" by incorporating a small amount of data from detailed simulations to obtain a more accurate prediction model. The effectiveness of this approach is demonstrated with an example involving the design of cellular materials for an electronics cooling application.

Chapter 2 deals with the development of a new Bayesian procedure for integrating low-accuracy and high-accuracy experiments. Standard practice in analyzing data from different types of experiments is to treat each type separately. By borrowing strength across multiple sources, an integrated analysis can produce better results. To this end, some Bayesian hierarchical Gaussian process models are proposed. The proposed method is illustrated with two examples: one with detailed and approximate finite element simulations for mechanical material design, and the other with physical and computer experiments for a fluidized bed process used in the food industry to coat food products with additives.
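The "borrowing strength" intuition can be seen in a much simpler conjugate-normal setting than the hierarchical Gaussian process models of the chapter: combining a noisy, plentiful source with a precise, scarce one always yields a posterior that is tighter than either source alone. The noise levels and sample sizes below are invented for the illustration.

```python
import numpy as np

# Two sources measuring the same quantity theta:
# low-accuracy (large noise, many runs) and high-accuracy (small noise, few runs).
rng = np.random.default_rng(1)
theta_true = 2.0
y_lo = theta_true + rng.normal(0.0, 1.0, size=30)   # sigma_lo = 1.0
y_hi = theta_true + rng.normal(0.0, 0.1, size=3)    # sigma_hi = 0.1

def posterior(sources, prior_mean=0.0, prior_prec=1e-6):
    # Conjugate normal update with known noise: precisions add across sources,
    # and the posterior mean is the precision-weighted average.
    prec, num = prior_prec, prior_prec * prior_mean
    for y, sigma in sources:
        prec += len(y) / sigma**2
        num += y.sum() / sigma**2
    return num / prec, 1.0 / prec  # posterior mean, posterior variance

mean_lo, var_lo = posterior([(y_lo, 1.0)])
mean_both, var_both = posterior([(y_lo, 1.0), (y_hi, 0.1)])
print(mean_lo, var_lo)
print(mean_both, var_both)
```

Because precisions add, the integrated posterior variance is guaranteed to be smaller than the low-accuracy-only variance; the hierarchical GP models in the chapter extend this principle to functions rather than a single scalar.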

Chapter 3 is devoted to the introduction of a structural equation method for temperature modeling in a data center computer experiment. Temperature modeling is key to designing and running a reliable data center, where many computer components operate constantly and generate heat. How different configurations affect the thermal distribution of a data center is largely unknown, because the physical thermal process is complex, depends on many factors, and detailed temperature measurements are not monitored in actual data centers. It is, however, possible to build physics-based mathematical models, implemented in computer code, to study the air movement and temperature distribution mechanisms. A statistical method based on latent variables is introduced for analyzing the multivariate temperature readings produced by the computer experiment. A two-stage estimation procedure is developed for the proposed latent variable model by making use of sufficient statistics and the pseudo-likelihood method.
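A stripped-down version of the two-stage idea can be sketched with a one-factor latent variable model: multivariate sensor readings are first reduced to a sufficient statistic (the sample covariance), and the loadings are then estimated from that statistic. This toy uses a principal-factor eigendecomposition in stage 2 rather than the chapter's pseudo-likelihood procedure, and the "sensor" setup is invented for the example.

```python
import numpy as np

# Toy one-factor model for multivariate temperature readings:
# y = loadings * f + noise, with a latent thermal state f for each run.
rng = np.random.default_rng(2)
loadings_true = np.array([1.0, 0.8, 0.6, 0.4])
f = rng.normal(size=2000)                      # latent variable per run
noise = 0.1 * rng.normal(size=(2000, 4))
Y = f[:, None] * loadings_true + noise         # 2000 runs x 4 sensors

# Stage 1: reduce the data to sufficient statistics (sample covariance).
S = np.cov(Y, rowvar=False)

# Stage 2: estimate the loadings from the leading eigenpair of S
# (principal-factor solution for a single latent variable).
vals, vecs = np.linalg.eigh(S)                 # eigenvalues in ascending order
loadings_hat = np.sqrt(vals[-1]) * vecs[:, -1]
loadings_hat *= np.sign(loadings_hat[0])       # resolve the sign indeterminacy

print(loadings_hat)
```

Splitting estimation into a data-reduction stage and a model-fitting stage keeps the second stage cheap even when the number of runs is large, since it operates only on the covariance matrix.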

Construction of designs for multiple experiments with different levels of accuracy is a new issue in design of experiments, because traditional methods deal almost exclusively with a single experiment. In Chapter 4, a method is proposed for constructing nested space-filling designs for such multiple experiments. The construction is aided by the use of Galois fields and orthogonal arrays. Multiple design sets generated by the proposed method are guaranteed to possess a space-filling property.
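The nesting requirement can be illustrated with a simplified permutation-based construction of a nested Latin hypercube: the small (high-accuracy) design is a subset of the large (low-accuracy) design, and each achieves one-dimensional stratification at its own granularity. This sketch uses random permutations rather than the Galois-field and orthogonal-array machinery of the chapter, and all names below are made up for the example.

```python
import numpy as np

def nested_lhs(n_small, multiple, dim, rng):
    # Build a large Latin hypercube of n_small * multiple points whose first
    # n_small rows also form a Latin hypercube on the coarser n_small grid.
    n_large = n_small * multiple
    large = np.empty((n_large, dim))
    for j in range(dim):
        # Each small-design point occupies a distinct coarse stratum; the
        # fine cell within that stratum is chosen at random.
        coarse = rng.permutation(n_small)
        offsets = rng.integers(0, multiple, n_small)
        small_levels = coarse * multiple + offsets
        # Fill the remaining fine cells with the rest of the large design.
        remaining = np.setdiff1d(np.arange(n_large), small_levels)
        rng.shuffle(remaining)
        col = np.concatenate([small_levels, remaining])  # permutation of 0..n_large-1
        large[:, j] = (col + rng.random(n_large)) / n_large
    return large[:n_small], large  # small design = first n_small rows

rng = np.random.default_rng(3)
small, large = nested_lhs(n_small=4, multiple=3, dim=2, rng=rng)
print(small.shape, large.shape)
```

By construction, the high-accuracy runs reuse a subset of the low-accuracy input settings, which is exactly what the integration methods of the earlier chapters need from the data.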

Status

• Workflow Status: Published
• Created By: Barbara Christopher
• Created: 10/08/2010
• Modified By: Fletcher Moore
• Modified: 10/07/2016
