A target-free video structural motion estimation method based on multi-path optimization
Published in: Mechanical Systems and Signal Processing, 2023-09, Vol. 198, p. 110452, Article 110452
Main authors: , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Vibration data are important for structural health monitoring (SHM). This paper proposes a novel method to adaptively estimate the video motions of a structure with subpixel accuracy, without attaching any targets. The proposed method comprises three steps. In the first step, to remove outliers while preserving feature points, a Gaussian range kernel is used together with a Gaussian spatial kernel, computed by polynomial fitting and recursive integral computation. In the second step, to calculate video pixel motions that vary with spatial coordinates within the region of interest (ROI) under test, the ROI is divided into multiple grid cells. The motions in each grid cell are modeled as local, spatially-variant homography matrices, whose spatial consistency is enhanced by a shape-preserving constraint. The third step enhances both the spatial and temporal correlations of the calculated homography matrices through a data term and a smoothness term defined in both the space and time domains. The superiority of the proposed method over traditional methods was validated in several case studies analyzing structural motions. In these comparisons, the proposed method produced denoised images, camera motions, structural motions, and structural modal information with subpixel accuracy, achieving the best accuracy among the compared methods.
ISSN: 0888-3270, 1096-1216
DOI: 10.1016/j.ymssp.2023.110452
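The first step described in the abstract, combining a Gaussian spatial kernel with a Gaussian range (intensity) kernel to remove outliers while preserving feature points, is the structure of a bilateral filter. The sketch below is a minimal direct NumPy implementation of that idea for illustration only; it omits the paper's polynomial-fitting and recursive-integral acceleration, and the function name and parameter values are assumptions, not from the paper.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Edge-preserving smoothing: a Gaussian spatial kernel weighted by a
    Gaussian range kernel over each (2*radius+1)^2 window."""
    img = np.asarray(img, dtype=float)
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)

    # Spatial kernel depends only on pixel offsets, so precompute it once.
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    spatial = np.exp(-(xx**2 + yy**2) / (2.0 * sigma_s**2))

    h, w = img.shape
    for i in range(h):
        for j in range(w):
            window = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range kernel: down-weights neighbours whose intensity differs
            # from the centre pixel, so edges/feature points are preserved
            # while smooth-region noise and outliers are averaged out.
            rng = np.exp(-((window - img[i, j]) ** 2) / (2.0 * sigma_r**2))
            weights = spatial * rng
            out[i, j] = np.sum(weights * window) / np.sum(weights)
    return out
```

With a small `sigma_r`, pixels across a sharp intensity edge receive near-zero range weight, which is why the filter denoises flat regions without blurring the edges that later feature matching relies on.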