Building Flexible Machine Learning Models for Scientific Computing at Scale
Saved in:
Main authors: | , , , , , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | Foundation models have revolutionized language modeling, but whether this
success can be replicated in scientific computing remains unexplored. We present
OmniArch, the first prototype aimed at solving multi-scale and multi-physics
scientific computing problems with physical alignment. We address all three
challenges with one unified architecture. Its pre-training stage contains a
Fourier encoder-decoder that harmonizes representations across separate
dimensions and a Transformer backbone that integrates physical quantities
through temporal dynamics, while the novel PDE-Aligner performs physics-informed
fine-tuning under flexible conditions. To the best of our knowledge, we are the
first to conduct unified 1D-2D-3D pre-training on PDEBench; the resulting model
not only sets new performance benchmarks for 1D, 2D, and 3D PDEs but also
demonstrates exceptional adaptability to new physics via in-context and
zero-shot learning, supporting realistic engineering applications and
prospective physics discovery. |
---|---|
DOI: | 10.48550/arxiv.2402.16014 |