Noise-Adaptive State Estimators with Change-Point Detection

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2024-07, Vol. 24 (14), p. 4585
Authors: Hou, Xiaolei; Zhao, Shijie; Hu, Jinjie; Lan, Hua
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: To track sharply maneuvering targets, this paper develops novel variational adaptive state estimators for joint estimation of the target state and the process noise parameters in a class of linear state-space models with abruptly changing parameters. By combining variational inference with change-point detection in an online Bayesian fashion, two adaptive estimators, a change-point-based adaptive Kalman filter (CPAKF) and a change-point-based adaptive Kalman smoother (CPAKS), are proposed within a recursive detection and estimation process. In each iteration, the run-length probability of the current maneuver mode is first calculated, and the joint posterior of the target state and process noise parameter conditioned on the run length is then approximated by variational inference. Compared with existing variational noise-adaptive Kalman filters, the proposed methods are robust to the initial settings of the iterative values, improving their ability to track sharply maneuvering targets. Moreover, the change-point detection divides the non-stationary time sequence into several stationary segments, allowing for an adaptive sliding length in the CPAKS method. The tracking performance of the proposed methods is evaluated on both synthetic and real-world datasets of maneuvering targets.
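The abstract describes an online Bayesian recursion over the run length of the current maneuver mode, with a filter update conditioned on each run-length hypothesis. The sketch below is only a minimal illustration of that general idea, not the authors' CPAKF: it runs a bank of standard Kalman filters indexed by run length and fuses them with a Bayesian online change-point recursion using a constant hazard rate. The state-space model, hazard rate, and all constants are hypothetical, and the variational update of the process noise parameter from the paper is omitted (the process noise covariance is kept fixed).

```python
# Illustrative sketch only (hypothetical model and constants): a bank of
# standard Kalman filters indexed by run length, fused by a Bayesian online
# change-point recursion with a constant hazard rate. The variational update
# of the process noise parameter described in the abstract is NOT included.
import numpy as np

dt, hazard = 1.0, 0.05                      # sampling interval, change-point hazard
F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity state transition
H = np.array([[1.0, 0.0]])                  # position-only measurement matrix
R = np.array([[1.0]])                       # measurement noise covariance
Q = 0.1 * np.array([[dt**3 / 3, dt**2 / 2],
                    [dt**2 / 2, dt]])       # fixed process noise covariance

def kf_step(m, P, z):
    """One Kalman predict/update; returns posterior mean, covariance, and the
    Gaussian predictive likelihood of measurement z under this hypothesis."""
    m_pred, P_pred = F @ m, F @ P @ F.T + Q
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    resid = z - H @ m_pred
    lik = (np.exp(-0.5 * resid.T @ np.linalg.inv(S) @ resid)
           / np.sqrt(np.linalg.det(2.0 * np.pi * S))).item()
    return m_pred + K @ resid, (np.eye(2) - K @ H) @ P_pred, lik

m0, P0 = np.zeros((2, 1)), 10.0 * np.eye(2)
hyps = [(1.0, m0, P0)]                      # (run-length weight, mean, covariance)

rng = np.random.default_rng(0)
truth = np.zeros((2, 1))
for t in range(50):
    if t == 25:
        truth[1, 0] += 5.0                  # abrupt maneuver: sudden velocity jump
    truth = F @ truth
    z = H @ truth + rng.normal(scale=1.0, size=(1, 1))

    # Grow each run-length hypothesis (no change point) and pool the mass that
    # moves to run length zero (a change point restarts from the prior).
    grown, restart_mass = [], 0.0
    for w, m, P in hyps:
        m_new, P_new, lik = kf_step(m, P, z)
        grown.append((w * lik * (1.0 - hazard), m_new, P_new))
        restart_mass += w * lik * hazard
    hyps = [(restart_mass, m0.copy(), P0.copy())] + grown

    # Normalise the run-length probabilities and prune negligible hypotheses.
    total = sum(w for w, _, _ in hyps)
    hyps = [(w / total, m, P) for w, m, P in hyps if w / total > 1e-4]
    w_sum = sum(w for w, _, _ in hyps)

    est = sum(w * m for w, m, _ in hyps) / w_sum   # mixture (MMSE) state estimate
    print(f"t={t:2d}  true_pos={truth[0, 0]:8.2f}  est_pos={est[0, 0]:8.2f}")
```

In the paper's CPAKF, each run-length hypothesis would additionally carry a variational posterior over the process noise parameter, updated jointly with the state in each iteration; here Q is held fixed purely for brevity.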
ISSN: 1424-8220
DOI: 10.3390/s24144585