How to Calibrate a Dynamical System With Neural Network Based Physics?
Saved in:
Published in: Geophysical Research Letters, 2022-04, Vol. 49 (8), p. 1-n/a
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Summary: Unlike the traditional subgrid scale parameterizations used in climate models, current machine learning (ML) parameterizations are only tuned offline, by minimizing a loss function on outputs from high‐resolution models. This approach often leads to numerical instabilities and long‐term biases. Here, we propose a method to design tunable ML parameterizations and calibrate them online. The calibration of the ML parameterization is achieved in two steps. First, some model parameters are included within the ML model input. This ML model is fitted at once for a range of values of the parameters, using an offline metric. Second, once the ML parameterization has been plugged into the climate model, the parameters included among the ML inputs are optimized with respect to an online metric quantifying errors on long‐term statistics. We illustrate our method with two simple dynamical systems. Our approach significantly reduces long‐term biases of the ML model.
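As a rough illustration of the first step, here is a minimal sketch, not taken from the paper: the toy tendency function, the parameter range, and the network architecture are all assumptions made for this example. A tunable parameter theta is appended to the state as an extra network input, and the network is fitted offline over a whole range of theta values so that it remains tunable afterwards.

```python
# Minimal sketch of step 1 (illustrative only; toy_tendency, the theta range
# and the MLP size are assumptions, not the paper's setup).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def toy_tendency(x, theta):
    # Stand-in for the "high-resolution" truth: a theta-dependent tendency.
    return -x + theta * np.sin(x)

# Offline training set sampled over a *range* of parameter values, so the
# fitted network keeps theta as a tunable input after training.
theta = rng.uniform(0.5, 2.0, size=10_000)
state = rng.uniform(-3.0, 3.0, size=10_000)
X = np.column_stack([state, theta])      # inputs: (x, theta)
y = toy_tendency(state, theta)           # targets: tendencies (offline metric: MSE)

nn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
nn.fit(X, y)                             # step 1: offline fit
```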
Plain Language Summary
In numerical climate models, processes occurring at scales smaller than the model resolution (e.g., convection, turbulence) need to be represented by “parameterizations.” Parameterizations provide a simplified yet numerically affordable representation of the modeled processes. Recently, parameterizations have also been developed using machine learning (ML), by fitting them to outputs from high-resolution climate models. This approach can lead to long-term biases once the ML parameterization is incorporated into the climate model, and it currently offers no way to calibrate the ML parameterization to alleviate these biases. We propose here an approach to calibrate ML parameterizations after they have been fitted to a learning sample. Our approach has been successfully tested on two toy models. A first set of experiments focuses on retrieving the values of the parameters used to generate a reference data set. In a second experiment, the values of some parameters not included in the neural network (NN) inputs were biased, resulting in errors in long-term statistics. Finding the optimal value of the NN input parameter significantly improved the accuracy of the resulting model. Our method could be applied to improve the prediction of long-term variables in climate models.
Key Points
Tunable parameters are included among the inputs of a neural network (NN) parameterization
The tunable NN input parameters are optimized online using a kriging method (see the sketch after this list)
Long-term statistical properties of the NN-based model are significantly improved after calibration
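A similarly hedged sketch of the second, online step, reusing the `nn` fitted in the sketch above: the toy integration, the time-mean statistic, its reference value, and the one-shot minimization of the kriging surrogate mean (standing in for an iterative, e.g. expected-improvement, optimization) are all assumptions for illustration.

```python
# Minimal sketch of step 2 (illustrative only; reuses `nn` from the previous
# sketch and substitutes a simple surrogate-mean search for the paper's
# kriging-based optimization loop).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def run_online(theta, nn, n_steps=1000, dt=0.01):
    # Integrate the toy model with the NN tendency plugged in, and return a
    # long-term statistic (here simply the time-mean state).
    x, traj = 1.0, []
    for _ in range(n_steps):
        x = x + dt * float(nn.predict([[x, theta]])[0])
        traj.append(x)
    return np.mean(traj)

def online_metric(theta, nn, target_stat=0.0):
    # Squared error on the long-term statistic (stand-in for the online metric).
    return (run_online(theta, nn) - target_stat) ** 2

# Small design of experiments in theta, then a kriging (Gaussian-process)
# surrogate of the online metric.
design = np.linspace(0.5, 2.0, 6)
scores = np.array([online_metric(t, nn) for t in design])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(design.reshape(-1, 1), scores)

# Calibrated value: the theta minimizing the surrogate mean on a fine grid.
grid = np.linspace(0.5, 2.0, 200).reshape(-1, 1)
theta_opt = float(grid[np.argmin(gp.predict(grid))][0])
print("calibrated theta:", theta_opt)
```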
ISSN: 0094-8276, 1944-8007
DOI: 10.1029/2022GL097872