Blind and Semi-Blind Deblurring of Natural Images

Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Image Processing, 2010-01, Vol. 19 (1), pp. 36-52
Main Authors: Almeida, M.S.C., Almeida, L.B.
Format: Article
Language: English
Description
Abstract: A method for blind image deblurring is presented. The method makes only weak assumptions about the blurring filter and is able to undo a wide variety of blurring degradations. To overcome the ill-posedness of the blind image deblurring problem, the method includes a learning technique which initially focuses on the main edges of the image and gradually takes details into account. A new image prior, which includes a new edge detector, is used. The method is able to handle unconstrained blurs, but also allows the use of constraints or of prior information on the blurring filter, as well as the use of filters defined in a parametric manner. Furthermore, it works in both single-frame and multiframe scenarios. The use of constrained blur models appropriate to the problem at hand, and/or of multiframe scenarios, generally improves the deblurring results. Tests performed on monochrome and color images, with various synthetic and real-life degradations, with and without noise, in single-frame and multiframe scenarios, showed good results, both in subjective terms and in terms of the increase in signal-to-noise ratio (ISNR) measure. In comparisons with other state-of-the-art methods, our method yields better results and is shown to be applicable to a much wider range of blurs.
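
The ISNR figure used in the abstract's evaluation is the standard increase-in-signal-to-noise-ratio measure, ISNR = 10 log10(||y - x||^2 / ||x_hat - x||^2), where x is the original image, y the degraded observation, and x_hat the restored image; positive values mean the restoration is closer to the original than the observation was. Below is a minimal Python sketch of this metric under the usual blur-plus-noise observation model y = h * x + n. The kernel, noise level, and function name are illustrative assumptions, not taken from the paper.

    import numpy as np
    from scipy.signal import fftconvolve

    def isnr(original, degraded, restored):
        """Increase in signal-to-noise ratio, in dB: ratio of the degraded
        image's squared error to the restored image's squared error, both
        measured against the original. Positive values mean the restoration
        improved on the degraded input."""
        err_degraded = np.sum((degraded - original) ** 2)
        err_restored = np.sum((restored - original) ** 2)
        return 10.0 * np.log10(err_degraded / err_restored)

    # Synthetic degradation following the usual observation model
    # y = h * x + n (convolution with a blur kernel plus additive noise).
    rng = np.random.default_rng(0)
    x = rng.random((128, 128))        # stand-in for a natural image
    w = np.hanning(9)                 # hypothetical 9x9 separable blur kernel
    h = np.outer(w, w)
    h /= h.sum()                      # normalize so the blur preserves brightness
    y = fftconvolve(x, h, mode="same") + 0.01 * rng.standard_normal(x.shape)

    # A perfect restoration would give ISNR = +inf; returning the degraded
    # image unchanged gives exactly 0 dB, by construction.
    print(f"ISNR of the unmodified degraded image: {isnr(x, y, y):.2f} dB")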
ISSN: 1057-7149 (print); 1941-0042 (electronic)
DOI: 10.1109/TIP.2009.2031231