A localized ensemble of approximate Gaussian processes for fast sequential emulation

Bibliographic details
Published in: Stat (International Statistical Institute), 2023-01, Vol. 12 (1), p. n/a
Authors: Rumsey, Kellin N.; Huerta, Gabriel; Tucker, J. Derek
Format: Article
Language: English
Online access: Full text

Description
Abstract: More attention has been given to the computational cost associated with fitting an emulator; substantially less attention has been given to the computational cost of using that emulator for prediction. This is primarily because the cost of fitting an emulator is usually far greater than that of obtaining a single prediction, and predictions can often be obtained in parallel. In many settings, however, especially those requiring Markov chain Monte Carlo, predictions arrive sequentially and parallelization is not possible. In this case, an emulator that can produce accurate predictions efficiently can lead to substantial time savings in practice. In this paper, we propose a global-model approximate Gaussian process framework via extension of a popular local approximate Gaussian process (laGP) framework. Our proposed emulator can be viewed as a treed Gaussian process in which the leaf nodes are laGP models and the tree structure is learned greedily as a function of the prediction stream. The suggested method (called leapGP) has interpretable tuning parameters which control the time-memory trade-off. One reasonable choice of settings leads to an emulator with an O(N²) training cost and rapid predictions with an asymptotic amortized cost of O(N).
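
The abstract only sketches how leapGP works, so below is a minimal illustrative sketch of the underlying "fit local GPs on demand and reuse them" idea in Python, not the authors' implementation (the original work builds on the laGP machinery in R). The class name LocalHubEmulator, the parameters n_local and rho, and the nearest-hub rule are assumptions made for illustration: when a prediction request arrives farther than rho from every cached hub, a local GP is fit on the n_local nearest training points and stored; otherwise the nearest hub's GP is reused.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel


class LocalHubEmulator:
    """Illustrative hub-and-reuse emulator (not the authors' leapGP code)."""

    def __init__(self, X, y, n_local=50, rho=0.1):
        self.X = np.asarray(X, dtype=float)   # (N, d) training inputs
        self.y = np.asarray(y, dtype=float)   # (N,) training outputs
        self.n_local = n_local                # neighbourhood size for each local GP
        self.rho = rho                        # reuse radius: time-memory trade-off knob
        self.hubs = []                        # list of (centre, fitted local GP)

    def _fit_local_gp(self, x):
        # Fit a GP on the n_local training points nearest to x (laGP-style local fit).
        idx = np.argsort(np.linalg.norm(self.X - x, axis=1))[: self.n_local]
        gp = GaussianProcessRegressor(
            kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-6),
            normalize_y=True,
        )
        gp.fit(self.X[idx], self.y[idx])
        return gp

    def predict(self, x):
        # Sequential prediction: reuse the nearest hub if it lies within rho,
        # otherwise fit a new local GP at x and cache it as a new hub.
        x = np.atleast_1d(np.asarray(x, dtype=float))
        if self.hubs:
            dists = [np.linalg.norm(centre - x) for centre, _ in self.hubs]
            j = int(np.argmin(dists))
            if dists[j] <= self.rho:
                return float(self.hubs[j][1].predict(x.reshape(1, -1))[0])
        gp = self._fit_local_gp(x)
        self.hubs.append((x.copy(), gp))
        return float(gp.predict(x.reshape(1, -1))[0])


# Example: emulate a toy simulator f(x) = sin(5x) + x on [0, 1] for a prediction stream.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, size=(500, 1))
    y = np.sin(5 * X[:, 0]) + X[:, 0]
    emu = LocalHubEmulator(X, y, n_local=40, rho=0.05)
    stream = rng.uniform(0, 1, size=200)          # sequential prediction requests
    preds = [emu.predict([x]) for x in stream]
    print(len(emu.hubs), "hubs created for 200 sequential predictions")
```

In this sketch, shrinking rho creates more hubs (more memory and more local fits) while a larger rho reuses cached hubs more aggressively, mirroring the time-memory trade-off that the paper's tuning parameters are described as controlling.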
ISSN: 2049-1573
DOI: 10.1002/sta4.576