Full Bayesian identification of linear dynamic systems using stable kernels

Bibliographic Details
Published in: Proceedings of the National Academy of Sciences - PNAS, 2023-05, Vol. 120 (18), p. e2218197120
Main authors: Pillonetto, G., Ljung, L.
Format: Article
Language: English
Online access: Full text
Description
Abstract: System identification learns mathematical models of dynamic systems from input-output data. Despite its long history, this research area is still extremely active. New challenges are posed by the identification of complex physical processes formed by the interconnection of dynamic systems; examples arise in biology and industry, e.g., in the study of brain dynamics or sensor networks. In recent years, regularized kernel-based identification, inspired by machine learning, has emerged as an interesting alternative to the classical approach commonly adopted in the literature. In the linear setting, it uses the class of stable kernels to encode fundamental features of physical dynamical systems, e.g., the smooth exponential decay of impulse responses. This class also includes unknown parameters, called hyperparameters, which play a role similar to that of the model order in controlling complexity. In this paper, we develop a linear system identification procedure by casting stable kernels in a full Bayesian framework. Our models incorporate hyperparameter uncertainty and consist of a mixture of dynamic systems over a continuous spectrum of dimensions. They are obtained by overcoming drawbacks of classical Markov chain Monte Carlo schemes that, when applied to stable kernels, are proved to become nearly reducible (i.e., unable to reconstruct the posteriors of interest in reasonable time). Numerical experiments show that full Bayes frequently outperforms state-of-the-art results on typical benchmark problems. Two real applications related to brain dynamics (neural activity) and sensor networks are also included.
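
To make the kernel-based approach described in the abstract concrete, the following is a minimal illustrative sketch, not the authors' code, of regularized impulse response estimation with a first-order stable spline (TC) kernel. It uses fixed hyperparameters, i.e., an empirical point estimate rather than the full Bayesian treatment of hyperparameter uncertainty developed in the paper; the function names and toy data are assumptions made for illustration only.

```python
import numpy as np

# Minimal sketch (illustrative, not the authors' implementation) of regularized
# kernel-based impulse response estimation with a first-order stable spline /
# TC kernel. Hyperparameters (scale c, decay lam, noise variance sigma2) are
# fixed here; the paper instead treats them in a full Bayesian way.

def tc_kernel(n, c=1.0, lam=0.8):
    """TC kernel K[i, j] = c * lam**max(i, j): encodes smooth exponential decay."""
    idx = np.arange(1, n + 1)
    return c * lam ** np.maximum.outer(idx, idx)

def estimate_impulse_response(u, y, n=50, c=1.0, lam=0.8, sigma2=0.01):
    """Gaussian posterior mean of the impulse response g given input u and output y."""
    N = len(y)
    # Regression matrix: y[t] ~ sum_k g[k] * u[t - k] + noise
    Phi = np.array([[u[t - k] if t - k >= 0 else 0.0 for k in range(n)]
                    for t in range(N)])
    K = tc_kernel(n, c, lam)
    # Closed form: g_hat = K Phi^T (Phi K Phi^T + sigma2 I)^{-1} y
    S = Phi @ K @ Phi.T + sigma2 * np.eye(N)
    return K @ Phi.T @ np.linalg.solve(S, y)

# Toy usage: recover the impulse response of a first-order system from noisy data.
rng = np.random.default_rng(0)
g_true = 0.5 * 0.7 ** np.arange(50)
u = rng.standard_normal(200)
y = np.convolve(u, g_true)[:200] + 0.1 * rng.standard_normal(200)
g_hat = estimate_impulse_response(u, y)
print("relative fit error:", np.linalg.norm(g_hat - g_true) / np.linalg.norm(g_true))
```

In this sketch the decay hyperparameter lam plays the complexity-controlling role the abstract attributes to the model order; the paper's contribution is to place a posterior over such hyperparameters rather than fixing them as done above.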
ISSN: 0027-8424, 1091-6490
DOI: 10.1073/pnas.2218197120