Exposure manipulation strategies for balancing computational efficiency and precision in seismic risk analysis

Bibliographic Details
Published in: Bulletin of Earthquake Engineering, 2024-07, Vol. 22 (9), pp. 4779-4795
Main authors: Papadopoulos, Athanasios N.; Roth, Philippe; Danciu, Laurentiu
Format: Article
Language: English
Online access: Full text
Description
Summary: Exposure models for regional seismic risk assessment often place assets at the centroids of administrative units for which data are available. At best, a top-down approach is followed, where such data are spatially disaggregated over a denser spatial grid using proxy datasets such as the distribution of population or the density of night-time lights. The resolution of the spatial grid is dictated either by the resolution of the proxy dataset or by constraints on computational resources. On the other hand, if a building-by-building database is available, it often needs to be aggregated to a resolution that ensures acceptable calculation runtimes and memory demands. Several studies have investigated the impact of exposure aggregation on loss estimates. Herein, unlike previous attempts, we leverage an extensive building-by-building database for the Swiss territory, which serves as ground truth. We first assess the aggregation-induced errors of standard risk metrics at different spatial scales. We then propose a new aggregation strategy, relying on K-means clustering of site parameters and a reduction of the loss-ratio uncertainty for aggregated assets. These interventions are designed to minimize errors while keeping the computational cost manageable.
ISSN: 1570-761X, 1573-1456
DOI: 10.1007/s10518-024-01929-6
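
The clustering-based aggregation summarized above can be illustrated with a short sketch. The Python snippet below is a minimal, assumption-laden illustration rather than the authors' implementation: it clusters building sites on coordinates and a single site parameter (Vs30) with scikit-learn's KMeans, then collapses the exposure to one asset per cluster. The column names (lon, lat, vs30, value), the feature choice, and the number of clusters are hypothetical; the paper's actual set of site parameters and its treatment of loss-ratio uncertainty are not reproduced here.

import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def aggregate_exposure(buildings: pd.DataFrame, n_clusters: int = 500) -> pd.DataFrame:
    """Cluster building sites on coordinates and Vs30, then collapse
    the exposure to one aggregated asset per cluster."""
    # Standardize features so coordinates and Vs30 contribute on
    # comparable scales before clustering.
    X = StandardScaler().fit_transform(buildings[["lon", "lat", "vs30"]])
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)
    # One aggregated asset per cluster: summed exposed value, mean site
    # parameters, and the cluster centroid as the asset location.
    return (
        buildings.assign(cluster=labels)
        .groupby("cluster")
        .agg(
            lon=("lon", "mean"),
            lat=("lat", "mean"),
            vs30=("vs30", "mean"),
            value=("value", "sum"),
            n_buildings=("lon", "size"),
        )
        .reset_index(drop=True)
    )

Clustering on site parameters (rather than on administrative boundaries alone) keeps buildings with similar ground-motion amplification in the same aggregated asset, which is what allows the error of the aggregated model to stay small relative to the building-by-building ground truth.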