Accelerated nested sampling with \(\beta\)-flows for gravitational waves
| Published in: | arXiv.org 2024-11 |
|---|---|
| Main authors: | , , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Full text |
| Abstract: | There is an ever-growing need in the gravitational wave community for fast and reliable inference methods, accompanied by an informative error bar. Nested sampling satisfies the last two requirements, but its computational cost can become prohibitive when using the most accurate waveform models. In this paper, we demonstrate the acceleration of nested sampling using a technique called posterior repartitioning. This method leverages nested sampling's unique ability to separate prior and likelihood contributions at the algorithmic level. Specifically, we define a 'repartitioned prior' informed by the posterior from a low-resolution run. To construct this repartitioned prior, we use a \(\beta\)-flow, a novel type of conditional normalizing flow designed to better learn deep tail probabilities. \(\beta\)-flows are trained on the entire nested sampling run and conditioned on an inverse temperature \(\beta\). Applying our methods to simulated and real binary black hole mergers, we demonstrate how they can reduce the number of likelihood evaluations required for convergence by up to an order of magnitude, enabling faster model comparison and parameter estimation. Furthermore, we highlight the robustness of using \(\beta\)-flows over standard normalizing flows to accelerate nested sampling. Notably, \(\beta\)-flows successfully recover the same posteriors and evidences as traditional nested sampling, even in cases where standard normalizing flows fail. |
| ISSN: | 2331-8422 |
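
The abstract invokes posterior repartitioning without stating the identity behind it. The sketch below records the standard formulation, assuming the paper follows it; the symbols \(\pi\) (prior), \(\mathcal{L}\) (likelihood), \(\tilde{\pi}\) (repartitioned prior) and \(p_\beta\) (annealed target) are illustrative notation, not taken from the paper.

```latex
% Minimal sketch of posterior repartitioning, assuming the standard formulation;
% the symbols below are illustrative choices, not the paper's own notation.
\begin{align*}
  % Nested sampling computes the Bayesian evidence
  \mathcal{Z} &= \int \mathcal{L}(\theta)\,\pi(\theta)\,\mathrm{d}\theta \\
  % For any normalized repartitioned prior \tilde{\pi}(\theta) > 0, the evidence
  % is unchanged when the likelihood absorbs the prior ratio:
  \mathcal{Z} &= \int
    \underbrace{\frac{\mathcal{L}(\theta)\,\pi(\theta)}{\tilde{\pi}(\theta)}}_{\tilde{\mathcal{L}}(\theta)}
    \,\tilde{\pi}(\theta)\,\mathrm{d}\theta \\
  % A flow conditioned on an inverse temperature \beta can plausibly represent the
  % family of annealed distributions traversed by a nested sampling run,
  p_\beta(\theta) &\propto \pi(\theta)\,\mathcal{L}(\theta)^{\beta},
  \qquad \beta \in [0, 1],
\end{align*}
% interpolating from the prior (\beta = 0) to the posterior (\beta = 1).
```

Because the evidence is invariant under this repartitioning, choosing \(\tilde{\pi}\) close to the posterior from a low-resolution run lets the sampler concentrate its live points near the bulk of the likelihood from the outset, which is consistent with the reduction in likelihood evaluations reported in the abstract.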