Physics-Informed Neural State Space Models via Learning and Evolution
Saved in:
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Recent works exploring the application of deep learning to dynamical systems
modeling have demonstrated that embedding physical priors into neural networks
can yield more effective, physically-realistic, and data-efficient models.
However, in the absence of complete prior knowledge of a dynamical system's
physical characteristics, determining the optimal structure and optimization
strategy for these models can be difficult. In this work, we explore methods
for discovering neural state space dynamics models for system identification.
Starting with a design space of block-oriented state space models and
structured linear maps with strong physical priors, we encode these components
into a model genome alongside network structure, penalty constraints, and
optimization hyperparameters. Demonstrating the overall utility of the design
space, we employ an asynchronous genetic search algorithm that alternates
between model selection and optimization and obtains accurate, physically
consistent models of three physical systems: an aerodynamic body, a continuous
stirred tank reactor, and a two-tank interacting system.
DOI: 10.48550/arxiv.2011.13497
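
To make the abstract's approach concrete, the following is a minimal sketch, assuming PyTorch, of a block-oriented neural state space model together with a toy "model genome" of structural and optimization choices. The names used here (BlockSSM, random_genome, mutate) are hypothetical illustrations of the idea described above, not the paper's actual code or API.

```python
# Minimal sketch (not the authors' implementation): a block-oriented neural
# state space model and a toy "model genome", assuming PyTorch is available.
import random
import torch
import torch.nn as nn

def mlp(sizes):
    """Small fully connected network used for each block."""
    layers = []
    for i in range(len(sizes) - 1):
        layers.append(nn.Linear(sizes[i], sizes[i + 1]))
        if i < len(sizes) - 2:
            layers.append(nn.ReLU())
    return nn.Sequential(*layers)

class BlockSSM(nn.Module):
    """Block-oriented neural state space model:
       x_{k+1} = f_x(x_k) + f_u(u_k),   y_k = f_y(x_k)."""
    def __init__(self, nx, nu, ny, hidden):
        super().__init__()
        self.f_x = mlp([nx, hidden, nx])   # state-transition block
        self.f_u = mlp([nu, hidden, nx])   # input (actuation) block
        self.f_y = mlp([nx, hidden, ny])   # output (observation) block

    def forward(self, x0, u_seq):
        x, ys = x0, []
        for u in u_seq:                    # roll the model out over the horizon
            x = self.f_x(x) + self.f_u(u)
            ys.append(self.f_y(x))
        return torch.stack(ys)

# Toy genome: structural and optimization choices searched over jointly.
def random_genome():
    return {
        "hidden": random.choice([16, 32, 64]),          # network width
        "penalty_weight": random.choice([0.0, 0.1, 1.0]),  # constraint penalty
        "lr": random.choice([1e-2, 1e-3, 1e-4]),        # optimizer step size
    }

def mutate(genome):
    """Resample one gene, mimicking mutation in a genetic search."""
    child = dict(genome)
    key = random.choice(list(child))
    child[key] = random_genome()[key]
    return child
```

In the setting described by the abstract, such a genome would additionally encode the choice of structured linear maps and penalty constraints, and an asynchronous genetic search would alternate between mutating genomes (model selection) and gradient-based fitting of the corresponding models (optimization).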