Efficient and robust phase unwrapping method based on SFNet



Bibliographic Details
Published in: Optics Express, 2024-04, Vol. 32 (9), p. 15410-15432
Authors: Zhang, Ziheng; Wang, Xiaoxu; Liu, Chengxiu; Han, Ziyu; Xiao, Qingxiong; Zhang, Zhilin; Feng, Wenlu; Liu, Mingyong; Lu, Qianbo
Format: Article
Language: English
Online access: Full text
Abstract: Phase unwrapping is a crucial step in obtaining the final physical information in the field of optical metrology. Although adept at handling phase maps with discontinuities and noise, most deep learning-based spatial phase unwrapping methods suffer from complex models and unsatisfactory performance, partly because their training datasets contain only simple noise types and the networks offer limited interpretability. This paper proposes a highly efficient and robust spatial phase unwrapping method based on an improved SegFormer network, SFNet. The SFNet architecture combines a hierarchical encoder without positional encoding with a decoder built from a lightweight fully connected multilayer perceptron. The method exploits the Transformer's self-attention mechanism to better capture the global structure of phase changes and reduce errors during unwrapping, and its lower parameter count speeds up the process. The network is trained on a simulated dataset containing various types of noise and phase discontinuity. The proposed method is compared with several state-of-the-art deep learning-based and traditional methods on key evaluation indices, such as RMSE and PFS, highlighting its structural stability, robustness to noise, and generalization.
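
As background for the abstract above, the sketch below illustrates the wrapped-phase model that any spatial phase unwrapping method, learning-based or not, must invert, together with the RMSE index named among the evaluation metrics. It is only a minimal NumPy illustration under stated assumptions, not the paper's implementation: the wrap-count formulation and the function names (wrap, unwrap_with_wrap_count, rmse) are introduced here for this example, and a network such as SFNet would replace the ideal wrap-count map with a learned prediction from the wrapped input.

```python
import numpy as np

def wrap(phi):
    """Wrap an absolute phase map into the principal interval (-pi, pi]."""
    return np.angle(np.exp(1j * phi))

def unwrap_with_wrap_count(wrapped, k):
    """Recover the absolute phase from the wrapped phase and an integer
    wrap-count map k (the dense quantity a segmentation-style network
    could be trained to predict; hypothetical formulation)."""
    return wrapped + 2.0 * np.pi * k

def rmse(pred, truth):
    """Root-mean-square error, one of the evaluation indices in the abstract."""
    return np.sqrt(np.mean((pred - truth) ** 2))

# Toy ground-truth phase: a smooth Gaussian bump spanning several wraps.
y, x = np.mgrid[-1:1:256j, -1:1:256j]
phi_true = 20.0 * np.exp(-(x**2 + y**2) / 0.3)

psi = wrap(phi_true)                                # what the instrument measures
k_true = np.round((phi_true - psi) / (2 * np.pi))   # ideal wrap-count map
phi_rec = unwrap_with_wrap_count(psi, k_true)       # exact recovery in this toy case

print("RMSE with a perfect wrap-count prediction:", rmse(phi_rec, phi_true))
```

Casting unwrapping as prediction of a per-pixel wrap count is one common way to pose the problem as dense prediction for a segmentation-style network; whether SFNet uses exactly this formulation is not stated in the record above.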
ISSN: 1094-4087
DOI: 10.1364/OE.517676