A new video magnification technique using complex wavelets with Radon transform application



Bibliographic Details
Published in: Signal, Image and Video Processing, 2018-11, Vol. 12 (8), p. 1505-1512
Main Authors: Fahmy, Omar M., Fahmy, Gamal, Fahmy, Mamdouh F.
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: Magnifying micro-movements in natural videos that are undetectable by the human eye has recently received considerable interest, due to its impact on numerous applications. In this paper, we use the dual-tree complex wavelet transform (DT-CWT) to analyze video frames in order to detect and magnify micro-movements and make them visible. We choose the DT-CWT for its excellent edge-preserving and nearly shift-invariant properties. To detect any minor change in an object's spatial position, the paper proposes to modify the phases of the CWT coefficients in the decompositions of successive video frames. Furthermore, the paper applies the Radon transform to track frame micro-movements without any temporal band-pass filtering. The paper starts by presenting a simple technique for designing the orthogonal filters that construct this CWT system. Next, it is shown that modifying the phase differences between the CWT coefficients of an arbitrary frame and a reference frame results in spatial magnification of the image, which in turn makes these micro-movements observable. Several simulation results are given to show that the proposed technique competes very well with existing micro-magnification approaches; in fact, it manages to yield superior video quality in far less computation time.
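
The abstract's two main ingredients, phase manipulation of DT-CWT coefficients and Radon-based tracking, can be illustrated in a few lines of Python. The sketch below is only an approximation of the general idea, not the authors' implementation: it uses the open-source dtcwt and scikit-image packages instead of the orthogonal filters designed in the paper, it arbitrarily takes frame 0 as the reference, and alpha, nlevels, and angle_deg are illustrative parameters rather than values from the paper.

import numpy as np
import dtcwt
from skimage.transform import radon

def magnify_phase(frames, alpha=10.0, nlevels=4):
    """Amplify micro-movements in a sequence of 2-D grayscale frames by
    boosting DT-CWT phase differences relative to the first frame."""
    transform = dtcwt.Transform2d()
    ref = transform.forward(frames[0], nlevels=nlevels)
    out = [frames[0]]
    for frame in frames[1:]:
        pyr = transform.forward(frame, nlevels=nlevels)
        for cur, base in zip(pyr.highpasses, ref.highpasses):
            # Wrapped phase difference to the reference frame, per complex
            # subband coefficient; values lie in [-pi, pi].
            dphi = np.angle(cur * np.conj(base))
            # Scale the phase difference by (1 + alpha) in place, leaving
            # magnitudes untouched, so only apparent motion is amplified.
            cur *= np.exp(1j * alpha * dphi)
        out.append(transform.inverse(pyr))
    return out

def radon_shift(frame_a, frame_b, angle_deg=0.0):
    """Estimate a global micro-shift between two frames from the offset of
    their Radon projections at one angle, with no temporal filtering."""
    theta = np.array([angle_deg])
    pa = radon(frame_a, theta=theta)[:, 0]
    pb = radon(frame_b, theta=theta)[:, 0]
    # The cross-correlation peak gives the projection offset in pixels.
    corr = np.correlate(pa - pa.mean(), pb - pb.mean(), mode="full")
    return int(corr.argmax()) - (len(pb) - 1)

Amplifying phase rather than magnitude is what keeps edges intact: in a nearly shift-invariant complex wavelet representation, a small spatial displacement appears mainly as a phase change of the coefficients, so exaggerating the phase difference exaggerates the displacement.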
ISSN: 1863-1703
eISSN: 1863-1711
DOI: 10.1007/s11760-018-1306-9