Machine learning-based tsunami inundation prediction derived from offshore observations
Published in: | Nature Communications, 2022-09, Vol. 13 (1), Article 5489 |
Main authors: | , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | The world's largest and densest tsunami observing system gives us the leverage to develop a method for real-time tsunami inundation prediction based on machine learning. Our method utilizes 150 offshore stations encompassing the Japan Trench to simultaneously predict tsunami inundation at seven coastal cities stretching ~100 km along the southern Sanriku coast. We trained the model using 3093 hypothetical tsunami scenarios from megathrust (Mw 8.0–9.1) and nearby outer-rise (Mw 7.0–8.7) earthquakes. The model was then tested against 480 unseen scenarios and three near-field historical tsunami events. The proposed machine learning-based model achieves accuracy comparable to the physics-based model with a ~99% reduction in computational cost, thus facilitating rapid prediction and efficient uncertainty quantification. Additionally, the direct use of offshore observations can increase the forecast lead time and eliminate the uncertainties typically associated with the tsunami source estimate required by the conventional modeling approach.
One of the main challenges in tsunami inundation prediction is the real-time computation that must be performed under restrictive time constraints. Here the authors show that a machine learning-based model can achieve accuracy comparable to a physics-based model with a ~99% reduction in computational cost. |
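As an illustration of the approach described in the abstract, the sketch below trains a simple surrogate regressor that maps offshore gauge waveforms directly to coastal inundation depths, so that at forecast time a single model evaluation replaces the physics-based simulation. The station and scenario counts follow the abstract; the network architecture, waveform length, number of inundation points, and the random placeholder data are assumptions for illustration only, not the authors' implementation.

```python
# Minimal sketch (assumed, not the authors' implementation): a surrogate model
# mapping offshore tsunami gauge waveforms to coastal inundation depths.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

n_scenarios = 3093 + 480   # hypothetical scenarios (training + unseen test), per the abstract
n_stations = 150           # offshore observation stations, per the abstract
n_steps = 12               # assumed number of waveform samples per station
n_coastal_points = 7 * 50  # assumed inundation points across the seven cities

# Placeholder random arrays standing in for simulated offshore waveforms (X)
# and the physics-based inundation depths used as training targets (y).
X = rng.normal(size=(n_scenarios, n_stations * n_steps))
y = rng.normal(size=(n_scenarios, n_coastal_points))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=480, random_state=0
)

# Multi-output neural-network regressor acting as a fast surrogate for the
# physics-based inundation solver.
model = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=50, random_state=0)
model.fit(X_train, y_train)

# At forecast time, a single forward pass on newly observed offshore waveforms
# replaces the full inundation simulation, which is where the reported ~99%
# computational cost reduction would come from.
predicted_depths = model.predict(X_test)
print(predicted_depths.shape)  # (480, 350)
```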
ISSN: | 2041-1723 |
DOI: | 10.1038/s41467-022-33253-5 |