Spectral Neural Operators

Bibliographic Details
Published in: arXiv.org, 2024-04
Main authors: Fanaskov, V; Oseledets, I
Format: Article
Language: English
Online access: Full text
Description
Abstract: Many applications in scientific computing require the approximation of mappings between Banach spaces. The recently introduced Fourier Neural Operator (FNO) and Deep Operator Network (DeepONet) can provide this functionality. For both of these neural operators, the input function is sampled on a given grid (uniform for FNO), and the output function is parametrized by a neural network. We argue that this parametrization leads to (1) opaque output that is hard to analyze and (2) systematic bias caused by aliasing errors in the case of FNO. The alternative, advocated in this article, is to use Chebyshev and Fourier series for both domain and codomain. The resulting Spectral Neural Operator (SNO) has transparent output, never suffers from aliasing, and may include many exact (lossless) operations on functions. The functionality is based on well-developed, fast, and stable algorithms from spectral methods. The implementation requires only standard numerical linear algebra. Our benchmarks show that for many operators, SNO is superior to FNO and DeepONet.
ISSN: 2331-8422
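
To make the idea in the abstract concrete, below is a minimal, hypothetical Python sketch (not the authors' implementation) of the spectral parametrization: the input function is reduced to truncated Fourier coefficients, a learned map acts on those coefficients, and the output function is returned via its coefficient expansion. All names (n_modes, sno_like_operator, the random matrix W standing in for a trained network) are illustrative assumptions.

```python
# Minimal sketch of a coefficient-to-coefficient operator, assuming a periodic
# 1D setting and a truncated Fourier representation. Not the SNO reference code.
import numpy as np

n_modes = 16   # number of retained Fourier modes (illustrative choice)
n_grid = 128   # uniform grid used to sample and evaluate functions

def to_fourier_coeffs(samples, n_modes):
    """Analysis: samples on a uniform periodic grid -> truncated coefficients."""
    coeffs = np.fft.rfft(samples) / len(samples)
    return coeffs[:n_modes]

def from_fourier_coeffs(coeffs, n_grid):
    """Synthesis: truncated coefficients -> samples on a uniform grid."""
    full = np.zeros(n_grid // 2 + 1, dtype=complex)
    full[:len(coeffs)] = coeffs
    return np.fft.irfft(full * n_grid, n=n_grid)

# Stand-in for the learned map between coefficient vectors; in SNO this would
# be a trained neural network, here it is just a fixed random linear map.
rng = np.random.default_rng(0)
W = rng.standard_normal((n_modes, n_modes)) * 0.1

def sno_like_operator(input_samples):
    a = to_fourier_coeffs(input_samples, n_modes)   # input as coefficients
    b = W @ a                                       # map acting in coefficient space
    return from_fourier_coeffs(b, n_grid)           # output given by its series

# Example: apply the sketch operator to u(x) = sin(2*pi*x) on [0, 1).
x = np.linspace(0.0, 1.0, n_grid, endpoint=False)
u = np.sin(2.0 * np.pi * x)
v = sno_like_operator(u)
print(v.shape)  # (128,)
```

In the paper, both the domain and codomain are treated as coefficient spaces (Fourier or Chebyshev), so the output series can be inspected and manipulated with standard spectral-method tools; the sketch above only illustrates the coefficient-to-coefficient structure of such an operator.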