Image restoration for synthetic aperture systems with a non-blind deconvolution algorithm via a deep convolutional neural network

Bibliographic Details
Published in: Optics Express 2020-03, Vol. 28 (7), p. 9929-9943
Authors: Hui, Mei, Wu, Yong, Li, Weiqian, Liu, Ming, Dong, Liquan, Kong, Lingqin, Zhao, Yuejin
Format: Article
Language: English
Online access: Full text
Description
Abstract: Optical synthetic aperture imaging systems, which consist of in-phase circular sub-mirrors, can greatly improve the spatial resolution of a space telescope. Owing to the sub-mirrors' dispersion and sparsity, however, the modulation transfer function is significantly lower than that of a fully filled aperture system, which causes noticeable blurring and loss of contrast in the collected image. Image restoration is therefore key to recovering a clear image. In this paper, a dedicated non-blind deconvolution algorithm for image restoration in optical synthetic aperture systems is proposed. A synthetic aperture convolutional neural network (CNN) is trained as a denoiser prior for restoring the image. Using an improved half-quadratic splitting algorithm, the restoration process is divided into two subproblems: deconvolution and denoising. The CNN removes noise in the gradient domain, and the learned gradients then guide the image deconvolution step. Compared with several conventional algorithms, the proposed method achieves the highest scores on the evaluation indexes. When the signal-to-noise ratio is 40 dB, the average peak signal-to-noise ratio rises from 23.7 dB for the degraded images to 30.8 dB for the restored images, and the structural similarity index increases from 0.78 to 0.93. Both quantitative and qualitative evaluations demonstrate that the proposed method is effective.
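The abstract only summarizes the optimization scheme. As a rough illustration of the half-quadratic splitting idea it describes, the sketch below implements a generic plug-and-play loop for non-blind deconvolution that alternates a closed-form Fourier-domain deconvolution step with a denoising step. This is not the authors' method: their learned CNN prior operates in the gradient domain, whereas here a simple Gaussian filter stands in for the denoiser and the update runs in the pixel domain. The function name hqs_deconvolve and all parameter values are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def hqs_deconvolve(y, psf, mu=0.05, iters=8):
    """Minimal half-quadratic splitting (HQS) sketch for non-blind deconvolution.

    y   : blurred, noisy image (2-D float array)
    psf : blur kernel of the synthetic aperture system
    mu  : penalty weight coupling the deconvolution and denoising subproblems
    """
    # Embed the PSF in an image-sized array and shift its centre to (0, 0)
    # so that multiplication in the Fourier domain realizes circular convolution.
    k = np.zeros_like(y)
    k[:psf.shape[0], :psf.shape[1]] = psf
    k = np.roll(k, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))

    K = np.fft.fft2(k)
    Y = np.fft.fft2(y)
    x = y.copy()  # initialize with the degraded image
    for _ in range(iters):
        # Denoising subproblem: a Gaussian filter stands in for the CNN denoiser prior.
        z = gaussian_filter(x, sigma=1.0)
        # Deconvolution subproblem, solved in closed form in the Fourier domain:
        #   x = argmin ||y - k*x||^2 + mu * ||x - z||^2
        X = (np.conj(K) * Y + mu * np.fft.fft2(z)) / (np.abs(K) ** 2 + mu)
        x = np.real(np.fft.ifft2(X))
    return x

# Hypothetical usage, assuming `blurred` and `psf` arrays are available:
# restored = hqs_deconvolve(blurred, psf, mu=0.05, iters=8)
```

Replacing the Gaussian filter with a trained CNN denoiser turns this loop into the plug-and-play scheme the paper builds on; the paper additionally applies the learned prior to image gradients rather than pixel intensities.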
ISSN: 1094-4087
DOI: 10.1364/OE.387623