Automatic Example-Based Image Colorization Using Location-Aware Cross-Scale Matching


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Image Processing, 2019-09, Vol. 28 (9), p. 4606-4619
Authors: Bo Li, Yu-Kun Lai, Matthew John, Paul L. Rosin
Format: Article
Language: English
Description
Abstract: Given a reference color image and a destination grayscale image, this paper presents a novel automatic colorization algorithm that transfers color information from the reference image to the destination image. Since the reference and destination images may contain content at different or even varying scales (due to changes of distance between objects and the camera), existing texture matching-based methods can often perform poorly. We propose a novel cross-scale texture matching method to improve the robustness and quality of the colorization results. Suitable matching scales are considered locally, which are then fused using global optimization that minimizes both the matching errors and spatial change of scales. The minimization is efficiently solved using a multi-label graph-cut algorithm. Since only low-level texture features are used, texture matching-based colorization can still produce semantically incorrect results, such as meadow appearing above the sky. We consider a class of semantic violation where the statistics of up-down relationships learned from the reference image are violated and propose an effective method to identify and correct unreasonable colorization. Finally, a novel nonlocal ℓ1 optimization framework is developed to propagate high-confidence micro-scribbles to regions of lower confidence to produce a fully colorized image. Qualitative and quantitative evaluations show that our method outperforms several state-of-the-art methods.
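To make the local scale-selection idea in the abstract concrete, the sketch below compares each destination patch against reference patches sampled at several candidate scales and keeps the scale with the smallest texture-feature distance. This is a minimal illustration, not the authors' implementation: the feature (patch mean/std), the function names, and the exhaustive search are all illustrative assumptions, and the paper's global fusion step (multi-label graph cut penalizing spatial change of scales) is deliberately omitted.

```python
import numpy as np

def patch_features(img, y, x, size):
    # Crude texture descriptor for illustration: mean and std of the patch.
    # The paper uses richer low-level texture features.
    p = img[y:y + size, x:x + size]
    return np.array([p.mean(), p.std()])

def best_scale(dst, ref, y, x, base=8, scales=(0.5, 1.0, 2.0)):
    """Per-location scale selection: match one destination patch (at its
    base size) against reference patches extracted at several scales, and
    return the scale giving the smallest feature distance. The paper then
    fuses these local choices globally with a multi-label graph cut; that
    optimization step is omitted in this toy sketch."""
    f_dst = patch_features(dst, y, x, base)
    best, best_cost = None, np.inf
    for s in scales:
        size = max(2, int(round(base * s)))
        # Exhaustive, non-overlapping search over reference positions
        # (only practical for toy-sized images).
        for ry in range(0, ref.shape[0] - size + 1, size):
            for rx in range(0, ref.shape[1] - size + 1, size):
                cost = np.linalg.norm(f_dst - patch_features(ref, ry, rx, size))
                if cost < best_cost:
                    best, best_cost = s, cost
    return best, best_cost

# Usage on random toy images:
rng = np.random.default_rng(0)
dst = rng.random((32, 32))
ref = rng.random((32, 32))
scale, cost = best_scale(dst, ref, 0, 0)
```

Selecting the scale independently per location, as above, is noisy; the paper's contribution is precisely to regularize these choices so that matching error and spatial variation of scale are minimized jointly.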
ISSN:1057-7149
1941-0042
DOI:10.1109/TIP.2019.2912291