RBSRICNN: Raw Burst Super-Resolution through Iterative Convolutional Neural Network
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: Modern digital cameras and smartphones mostly rely on image signal processing (ISP) pipelines to produce realistic colored RGB images. However, compared to DSLR cameras, many portable mobile devices with compact camera sensors usually produce low-quality images due to their physical limitations. These low-quality images suffer from multiple degradations, i.e., sub-pixel shifts due to camera motion, mosaic patterns due to the camera's color filter array, low resolution due to the smaller camera sensor, and noise corrupting the remaining signal. Such degradations limit the performance of current Single Image Super-Resolution (SISR) methods in recovering high-resolution (HR) image details from a single low-resolution (LR) image. In this work, we propose a Raw Burst Super-Resolution Iterative Convolutional Neural Network (RBSRICNN) that follows the burst photography pipeline as a whole through a forward (physical) model. In contrast to existing black-box data-driven methods, the proposed Burst SR scheme solves the problem with classical image regularization, convex optimization, and deep learning techniques. The proposed network produces the final output by iteratively refining the intermediate SR estimates. We demonstrate the effectiveness of the proposed approach in quantitative and qualitative experiments, showing that it generalizes robustly to real LR burst inputs with only synthetic burst data available for training.
DOI: 10.48550/arxiv.2110.13217
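
The summary describes the forward (physical) model of the burst pipeline only at a high level: each raw LR frame is an HR image degraded by a sub-pixel shift, color-filter-array mosaicking, downsampling, and noise. The sketch below illustrates one plausible synthetic burst-generation pipeline under those assumptions; it is not the paper's implementation, and the function names (`simulate_raw_burst`, `bayer_mosaic`), the RGGB pattern, the translation-only warp, the box downsampling, and the Gaussian noise model are all illustrative choices.

```python
# Minimal sketch of a raw-burst degradation (forward) model, assuming:
# translation-only sub-pixel shifts, an RGGB Bayer CFA, box downsampling,
# and additive Gaussian noise. Names and parameters are illustrative and
# not taken from the RBSRICNN code.
import numpy as np
from scipy.ndimage import shift as subpixel_shift


def bayer_mosaic(rgb):
    """Sample an RGGB Bayer pattern from an HxWx3 RGB image."""
    h, w, _ = rgb.shape
    raw = np.zeros((h, w), dtype=rgb.dtype)
    raw[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R
    raw[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G
    raw[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G
    raw[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B
    return raw


def simulate_raw_burst(hr_rgb, burst_size=8, scale=4, noise_std=0.01, seed=0):
    """Generate a synthetic LR raw burst from a single HR RGB image."""
    rng = np.random.default_rng(seed)
    burst = []
    for _ in range(burst_size):
        # 1) sub-pixel shift caused by hand-held camera motion
        dy, dx = rng.uniform(-2.0, 2.0, size=2)
        warped = np.stack(
            [subpixel_shift(hr_rgb[..., c], (dy, dx), order=1, mode="nearest")
             for c in range(3)], axis=-1)
        # 2) downsample to the sensor resolution (box average)
        h, w, _ = warped.shape
        lr = warped[:h - h % scale, :w - w % scale].reshape(
            h // scale, scale, w // scale, scale, 3).mean(axis=(1, 3))
        # 3) mosaic with the color filter array
        raw = bayer_mosaic(lr)
        # 4) corrupt with noise
        raw = raw + rng.normal(0.0, noise_std, raw.shape)
        burst.append(np.clip(raw, 0.0, 1.0))
    return np.stack(burst)  # shape: (burst_size, H/scale, W/scale)


if __name__ == "__main__":
    hr = np.random.rand(256, 256, 3)  # stand-in HR image in [0, 1]
    lr_burst = simulate_raw_burst(hr)
    print(lr_burst.shape)  # (8, 64, 64)
```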