Self-supervised learning based on Transformer for flow reconstruction and prediction
Published in: Physics of Fluids (1994), 2024-02, Vol. 36 (2)
Main authors: , ,
Format: Article
Language: eng
Subjects:
Online access: Full text
Abstract: Machine learning has great potential for efficient reconstruction and prediction of flow fields. However, existing datasets may carry highly diversified labels for different flow scenarios, which makes them unsuitable for training a single model. To this end, we make a first attempt to apply the self-supervised learning (SSL) technique to fluid dynamics, which disregards data labels when pre-training the model. The SSL technique embraces a large amount of data (8000 snapshots) at Reynolds numbers of Re = 200, 300, 400, and 500 without discriminating between them, which improves the generalization of the model. The Transformer model is pre-trained via a specially designed pretext task, in which it reconstructs the complete flow fields after 20% of the data points in each snapshot have been randomly masked. For the downstream task of flow reconstruction, the pre-trained model is fine-tuned separately with 256 snapshots for each Reynolds number. The fine-tuned models accurately reconstruct the complete flow fields from fewer than 5% random data points within a limited window, even for Re = 250 and 600, whose data were not seen in the pre-training phase. For the other downstream task of flow prediction, the pre-trained model is fine-tuned separately with 128 consecutive snapshot pairs for each corresponding Reynolds number. The fine-tuned models then correctly predict the evolution of the flow fields over many cycles. We compare all results generated by models trained via SSL with those of models trained via supervised learning; the former show unequivocally superior performance. We expect that the methodology presented here will find wider applications in fluid mechanics.
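The pretext task described in the abstract (randomly masking 20% of the data points in each snapshot and training a Transformer to reconstruct the complete flow field) can be illustrated with a minimal sketch. The per-point tokenization, model sizes, optimizer settings, and names such as `MaskedFlowTransformer` and `pretrain_step` are illustrative assumptions, not the paper's actual implementation:

```python
# Minimal sketch of the masked-reconstruction pretext task: mask 20% of the
# points in each flow snapshot and train a Transformer to reconstruct the
# complete field. All architectural details below are assumptions.
import torch
import torch.nn as nn

class MaskedFlowTransformer(nn.Module):
    def __init__(self, n_points, n_channels=2, d_model=128, n_heads=4, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)                   # per-point embedding (e.g. u, v)
        self.pos = nn.Parameter(torch.zeros(1, n_points, d_model))    # learned positional encoding
        self.mask_token = nn.Parameter(torch.zeros(1, 1, d_model))    # placeholder for masked points
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        self.head = nn.Linear(d_model, n_channels)                    # regress flow variables at every point

    def forward(self, x, mask):
        # x: (batch, n_points, n_channels); mask: (batch, n_points) bool, True = hidden
        tok = self.embed(x)
        tok = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(tok), tok)
        return self.head(self.encoder(tok + self.pos))

def pretrain_step(model, optimizer, snapshots, mask_ratio=0.2):
    """One self-supervised step: hide 20% of the points, reconstruct the full snapshot."""
    mask = torch.rand(snapshots.shape[:2], device=snapshots.device) < mask_ratio
    pred = model(snapshots, mask)
    loss = nn.functional.mse_loss(pred, snapshots)   # reconstruction loss over all points
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Per the abstract, the pre-trained weights would then be fine-tuned separately for the two downstream tasks: reconstruction from fewer than 5% of the data points (256 snapshots per Reynolds number) and prediction from consecutive snapshot pairs (128 pairs per Reynolds number).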
ISSN: 1070-6631, 1089-7666
DOI: 10.1063/5.0188998