Hierarchical Refinement of Latin Hypercube Samples
| Published in: | Computer-Aided Civil and Infrastructure Engineering, 2015-05, Vol. 30 (5), pp. 394-411 |
|---|---|
| Author: | |
| Format: | Article |
| Language: | English |
| Online access: | Full text |
Abstract: In this article, a novel method for the extension of sample size in Latin Hypercube Sampling (LHS) is suggested. The method can be applied when an initial LH design is employed for the analysis of functions g of a random vector. The article explains how the statistical, sensitivity, and reliability analyses of g can be divided into a hierarchical sequence of simulations with subsets of samples of a random vector in such a way that (i) the favorable properties of LHS are retained (the low number of simulations needed for statistically significant estimations of statistical parameters of function g with low estimation variability); and (ii) the simulation process can be halted, for example, when the estimations reach a certain prescribed statistical significance. An important aspect of the method is that it efficiently simulates subsets of samples of random vectors while focusing on their correlation structure or any other objective function, such as some measure of dependence, spatial distribution uniformity, discrepancy, etc. This is achieved by employing a robust algorithm based on combinatorial optimization of the mutual ordering of samples. The method is primarily intended to serve as a tool for computationally intensive evaluations of g where there is a need for pilot numerical studies, preliminary and subsequently refined estimations of statistical parameters, optimization of the progressive learning of neural networks, or during experimental design.
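The core idea of hierarchical refinement, as described in the abstract, is that a Latin Hypercube design of size n can be extended to size 2n such that the combined set is again a valid LH design on the doubled stratification. The following is a minimal sketch of that idea only, not the paper's algorithm: where the paper uses combinatorial optimization of the mutual ordering of samples to control the correlation structure, this sketch assigns the new points to the empty half-strata at random.

```python
import random

def lhs(n, dim, rng=None):
    """Basic Latin Hypercube sample on [0, 1)^dim: in each dimension,
    exactly one point falls in each of the n equal-width strata."""
    rng = rng or random.Random(0)
    cols = []
    for _ in range(dim):
        perm = list(range(n))
        rng.shuffle(perm)  # random assignment of strata to points
        cols.append([(p + rng.random()) / n for p in perm])
    return [list(pt) for pt in zip(*cols)]

def refine(samples, rng=None):
    """Double the sample size so that old + new points together form a
    valid LHS on the refined (2n-stratum) grid.

    Each old point already occupies one half-stratum per dimension; the
    n new points are placed, one each, in the n empty half-strata.  The
    per-dimension assignment is shuffled at random here -- a stand-in
    for the paper's combinatorial optimization of mutual ordering."""
    rng = rng or random.Random(1)
    n = len(samples)
    dim = len(samples[0])
    new_cols = []
    for d in range(dim):
        occupied = {int(s[d] * 2 * n) for s in samples}
        empty = [k for k in range(2 * n) if k not in occupied]
        rng.shuffle(empty)
        new_cols.append([(k + rng.random()) / (2 * n) for k in empty])
    return samples + [list(pt) for pt in zip(*new_cols)]
```

Because each old point lies in a distinct coarse stratum, it occupies a distinct half-stratum after refinement, so exactly n half-strata per dimension remain empty for the new points; the process can therefore be repeated, halting once the estimates reach the required statistical significance.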
ISSN: 1093-9687, 1467-8667
DOI: 10.1111/mice.12088