Two-bit correlation: an adaptive time delay estimation


Bibliographic Details
Published in: IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, 1996-05, Vol. 43 (3), p. 473-481
Main authors: Liang-Min Wang, Shung, K.K., Camps, O.L.
Format: Article
Language: English
Description
Abstract: Time delay estimation is a very important operation in ultrasound time-domain flow mapping and in the correction of phase aberration of an array transducer. As interest increases in the application of one-and-a-half-dimensional (1.5-D) and two-dimensional (2-D) array transducers to improving image quality and to three-dimensional (3-D) imaging, the need for simple, fast, and sufficiently accurate algorithms for real-time time delay estimation becomes critical. In this paper, we present an adaptive time-delay estimation algorithm that reduces the noise sensitivity associated with one-bit correlation while retaining simplicity of implementation. This algorithm converts each sample into a two-bit representation encoding the sign of the sample and a comparison against an adaptively selected threshold. A bit-pattern correlation operation is then applied to find the time delay between the two signals involved. Using misregistration as the evaluation criterion, we show that the proposed algorithm is less susceptible to noise than one-bit correlation. Analytical results show that the reduction in misregistration of the two-bit correlation over its one-bit counterpart is consistent over a wide range of noise levels; this is achieved by adaptively adjusting the threshold to accommodate signal corruption due to noise. The analytical results are corroborated by simulations modeling blood as a random distribution of red blood cells. Finally, we also present a memory-based architecture to implement the two-bit correlation algorithm whose computation time does not depend upon the time delay of the signals to be correlated.
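The abstract describes the algorithm's two ingredients: quantizing each sample to a sign bit plus a magnitude bit against an adaptive threshold, and counting bit-pattern agreements across candidate lags. The paper's exact threshold-adaptation rule and architecture are not given in this record, so the sketch below is a minimal illustration under assumed choices: the threshold is taken as a multiple of the signal's RMS value (a stand-in for the adaptive rule), and the delay estimate is the lag maximizing the fraction of matching two-bit patterns.

```python
import numpy as np

def two_bit_quantize(x, k=1.0):
    """Convert each sample to two bits: a sign bit and a magnitude bit.
    The threshold rule (k times the RMS of the signal) is an assumed
    stand-in for the paper's adaptive threshold selection."""
    thresh = k * np.sqrt(np.mean(x ** 2))
    sign_bit = x >= 0
    mag_bit = np.abs(x) >= thresh
    return sign_bit, mag_bit

def two_bit_delay(ref, sig, max_lag):
    """Estimate the delay of `sig` relative to `ref` by counting matching
    two-bit patterns over the overlap at each candidate lag and returning
    the lag with the highest match fraction."""
    rs, rm = two_bit_quantize(ref)
    ss, sm = two_bit_quantize(sig)
    n = len(ref)
    best_lag, best_score = 0, -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            r, s = slice(0, n - lag), slice(lag, n)   # compare ref[j] with sig[j+lag]
        else:
            r, s = slice(-lag, n), slice(0, n + lag)
        # agreement on both bits (bitwise XNOR), averaged over the overlap
        score = ((rs[r] == ss[s]) & (rm[r] == sm[s])).mean()
        if score > best_score:
            best_score, best_lag = score, lag
    return best_lag
```

Because the correlation reduces to counting bit agreements, it maps naturally onto simple logic (XNOR plus a counter), which is consistent with the memory-based hardware implementation the paper proposes; the match-fraction search here is only a software analogue of that idea.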
ISSN: 0885-3010, 1525-8955
DOI: 10.1109/58.489407