First-order methods and automatic differentiation: A multi-step systems identification perspective
Format: Article
Language: English
Abstract: This paper presents a tool for multi-step system identification that leverages first-order optimization and exact gradient computation. Drawing inspiration from neural network training and Automatic Differentiation (AD), the proposed method computes and analyzes the gradients with respect to the parameters to be identified by propagating them through the system dynamics. In doing so, it defines a linear, time-varying dynamical system that models the gradient evolution. This makes it possible to formally address the "exploding gradient" issue by providing conditions for a reliable and efficient optimization and identification process for dynamical systems. Results indicate that the proposed method is both effective and efficient, making it a promising tool for future research and applications in nonlinear system identification and non-convex optimization.
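The idea described in the abstract can be illustrated with a minimal sketch (not the authors' implementation): a multi-step prediction loss is rolled out through the system dynamics, and automatic differentiation propagates parameter sensitivities through the same recursion. The scalar system x_{k+1} = a*x_k + b*u_k, the quadratic loss, the synthetic data, and the learning rate are all illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def rollout(params, x0, inputs):
    """Simulate the system forward over multiple steps; AD propagates
    parameter sensitivities through the same recursion."""
    a, b = params
    x, xs = x0, []
    for u in inputs:
        x = a * x + b * u  # assumed scalar dynamics, for illustration
        xs.append(x)
    return jnp.stack(xs)

def loss(params, x0, inputs, targets):
    # Multi-step prediction error over the whole trajectory.
    return jnp.mean((rollout(params, x0, inputs) - targets) ** 2)

# Synthetic data generated with assumed "true" parameters (0.8, 0.5).
inputs = jnp.ones(20)
targets = rollout(jnp.array([0.8, 0.5]), 0.0, inputs)

# The sensitivity s_k = dx_k/d(a,b) obeys the linear, time-varying
# recursion s_{k+1} = a*s_k + [x_k, u_k]: gradients stay bounded for
# |a| < 1 and explode for |a| > 1, which is the issue the paper
# formalizes.
grad_fn = jax.grad(loss)
params = jnp.array([0.5, 0.1])
for _ in range(500):
    params = params - 0.01 * grad_fn(params, 0.0, inputs, targets)
```

Plain gradient descent with a fixed step is used here only for brevity; any first-order optimizer could consume the exact gradients that `jax.grad` returns.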
DOI: 10.48550/arxiv.2410.03544