Edge‐aware image outpainting with attentional generative adversarial networks


Full Description

Bibliographic Details
Published in: IET Image Processing, 2022-05, Vol. 16 (7), pp. 1807-1821
Authors: Li, Xiaoming; Zhang, Hengzhi; Feng, Lei; Hu, Jing; Zhang, Rongguo; Qiao, Qiang
Format: Article
Language: English
Online access: Full text
Description
Abstract: Image outpainting aims at extending the field of view of an existing image. While image inpainting has achieved great success with deep learning technology, image outpainting has received less attention. The main challenge is generating high-quality extended images with clear texture and highly consistent semantic information. To address the problems that invalid pixels affect the generated image and that the valid pixels are too far from the region to be generated, this paper proposes a two-stage image outpainting method (the EA method), which consists of an edge generation stage and an edge transformation stage. The convolutional block attention module (CBAM) is introduced into the generation network to focus on spatial and channel features, and an improved VAE-GAN structure is used to generate the extended image with more realistic semantics. The EA method is evaluated on the CelebA, Paris StreetView, and a homemade landscapes dataset; the results contain high-quality textures and faithfully extend the semantics. The average PSNR, SSIM, and FID of the EA method across the three datasets are 22.7961, 0.7061, and 6.8553, showing that it outperforms existing algorithms in both quantitative and qualitative analysis.
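The CBAM mentioned in the abstract applies channel attention followed by spatial attention to a feature map. As a rough illustration of that two-step gating idea, here is a minimal NumPy sketch; it is a simplification, not the paper's implementation (the shared MLP in the channel branch and the 7x7 convolution in the spatial branch are omitted, and all names are hypothetical):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbam_attention(feat):
    """Simplified CBAM-style attention on a (C, H, W) feature map:
    channel gating from spatially pooled descriptors, then spatial
    gating from channel-pooled maps."""
    # Channel attention: average- and max-pool over spatial dims,
    # combine, squash to (0, 1), and reweight each channel.
    # (CBAM proper passes the pooled vectors through a shared MLP.)
    avg_c = feat.mean(axis=(1, 2))           # (C,)
    max_c = feat.max(axis=(1, 2))            # (C,)
    ch_gate = sigmoid(avg_c + max_c)         # (C,)
    feat = feat * ch_gate[:, None, None]
    # Spatial attention: average- and max-pool over channels,
    # combine into a single (H, W) gate.
    # (CBAM proper applies a 7x7 convolution here.)
    avg_s = feat.mean(axis=0)                # (H, W)
    max_s = feat.max(axis=0)                 # (H, W)
    sp_gate = sigmoid(avg_s + max_s)         # (H, W)
    return feat * sp_gate[None, :, :]

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8, 8))
y = cbam_attention(x)
print(y.shape)  # (4, 8, 8)
```

Because both gates lie in (0, 1), the module can only attenuate features, steering the generator toward the spatial positions and channels it deems informative, which is the role the EA method assigns to CBAM in its generation network.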
ISSN:1751-9659
1751-9667
DOI:10.1049/ipr2.12447