Efficient detection and robust tracking of spermatozoa in microscopic video

Bibliographic Details
Published in: IET Image Processing, 2021-11, Vol. 15 (13), pp. 3200-3210
Main authors: Zhu, Ronghua; Cui, Yansong; Hou, Enyu; Huang, Jianming
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Sperm concentration and motility in microscopic videos are generally analysed only for sperm in the discrete (non-aggregated) state, and nonspecific sperm aggregation areas make accurate sperm detection difficult. This paper proposes an algorithm for automatic segmentation of nonspecific aggregates and for the detection and tracking of sperm. A grid model commensurate with the size of a sperm head is created to segment nonspecific aggregation areas, and a multi-scale edge function and a new energy functional are designed within the level set framework to segment sperm heads. In the sperm tracking stage, we improve the weight condition and the criterion for trust-flow quantization in the graph-theory-based method, and simplify sperm tracking to vertex matching between two frames, which solves the matching-failure problem of adjacent frames separated by small spatial distances. The proposed method achieves accurate segmentation of nonspecific sperm aggregation regions, outperforming the LBF and SBGFR level set methods, and it computes sperm concentration and motility in real time during tracking. Compared with four state-of-the-art algorithms, it yields lower tracking error rates and therefore has potential applications in the male fertility field.
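The abstract reduces the tracking stage to vertex matching between detections in two adjacent frames. As an illustration only, the sketch below solves such a matching with a plain Euclidean-distance cost and the Hungarian algorithm; the function name, the `max_dist` gate, and the distance-based cost are assumptions for this sketch and do not reproduce the paper's improved weight conditions or trust-flow quantization.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_detections(prev_centroids, curr_centroids, max_dist=20.0):
    """Match sperm-head centroids between two consecutive frames.

    prev_centroids, curr_centroids: arrays of shape (N, 2) and (M, 2)
    holding (x, y) head positions detected in frame t and frame t+1.
    Returns a list of (i, j) pairs linking detection i in frame t
    to detection j in frame t+1.
    """
    prev_centroids = np.asarray(prev_centroids, dtype=float)
    curr_centroids = np.asarray(curr_centroids, dtype=float)
    if len(prev_centroids) == 0 or len(curr_centroids) == 0:
        return []

    # Cost matrix: Euclidean distance between every pair of detections.
    # The paper's weighted graph uses richer edge weights; plain distance
    # is used here only to keep the sketch self-contained.
    cost = np.linalg.norm(
        prev_centroids[:, None, :] - curr_centroids[None, :, :], axis=2
    )

    # Minimum-cost assignment (Hungarian algorithm) performs the
    # frame-to-frame vertex matching in one step.
    rows, cols = linear_sum_assignment(cost)

    # Gate out implausible links: a head should not jump farther than
    # max_dist pixels between adjacent frames (threshold is assumed).
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= max_dist]
```

In this reading, the per-frame detections form the vertices of a bipartite graph and the assignment step selects the edge set; counting matched and unmatched vertices per frame is what would allow concentration and motility to be updated as tracking proceeds.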
ISSN: 1751-9659
1751-9667
DOI: 10.1049/ipr2.12316