Pyramid-VAE-GAN: Transferring hierarchical latent variables for image inpainting

Bibliographic Details
Published in: Computational Visual Media 2023-12, Vol. 9 (4), pp. 827-841
Main authors: Tian, Huiyuan; Zhang, Li; Li, Shijian; Yao, Min; Pan, Gang
Format: Article
Language: English
Online access: Full text
Description
Abstract: Significant progress has been made in image inpainting in recent years; however, existing methods cannot produce results with reasonable structures, rich detail, and sharpness at the same time. In this paper, we propose the Pyramid-VAE-GAN network for image inpainting to address this limitation. Our network is built on a variational autoencoder (VAE) backbone that encodes high-level latent variables to represent complicated high-dimensional prior distributions of images. The prior assists in reconstructing reasonable structures when inpainting. We also adopt a pyramid structure in our model to maintain rich detail in low-level latent variables. To overcome the usual trade-off between reasonable structures and rich detail, we propose a novel cross-layer latent variable transfer module. It transfers long-range structural information contained in high-level latent variables to the low-level latent variables that encode finer detail. We further use adversarial training to select the most reasonable results and to improve image sharpness. Extensive experimental results on multiple datasets demonstrate the superiority of our method. Our code is available at https://github.com/thy960112/Pyramid-VAE-GAN.
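The cross-layer latent variable transfer described in the abstract can be illustrated with a minimal sketch. The PyTorch snippet below is an assumption-based approximation, not the authors' implementation (see the linked repository for that): the module name `LatentTransfer`, the channel widths, the upsampling mode, and the 1x1-convolution fusion are all hypothetical choices made only to show how structural information from a coarse high-level latent could condition a finer, detail-bearing latent in a pyramid of latents.

```python
# Illustrative sketch only -- NOT the authors' code (see the linked repo).
# Shows the general idea: a high-level (coarse, structural) latent is
# upsampled and fused into a low-level (fine, detail) latent.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentTransfer(nn.Module):
    """Hypothetical cross-layer latent variable transfer block.

    Upsamples the high-level latent to the low-level latent's spatial
    size, then fuses the two with a learned 1x1 convolution so that
    long-range structure guides the detail-bearing latent.
    """
    def __init__(self, high_ch: int, low_ch: int):
        super().__init__()
        self.fuse = nn.Conv2d(high_ch + low_ch, low_ch, kernel_size=1)

    def forward(self, z_high: torch.Tensor, z_low: torch.Tensor) -> torch.Tensor:
        # Match spatial resolutions (high-level latents are coarser).
        z_up = F.interpolate(z_high, size=z_low.shape[-2:], mode="nearest")
        # Concatenate along channels and project back to the low-level width.
        return self.fuse(torch.cat([z_up, z_low], dim=1))

# Usage with an assumed 2-level pyramid: coarse 8x8 and fine 32x32 latents.
transfer = LatentTransfer(high_ch=256, low_ch=64)
z_high = torch.randn(1, 256, 8, 8)   # high-level latent: structure prior
z_low = torch.randn(1, 64, 32, 32)   # low-level latent: detail
z_low_conditioned = transfer(z_high, z_low)
print(z_low_conditioned.shape)       # torch.Size([1, 64, 32, 32])
```

In this reading, the fused low-level latent keeps its original resolution and width, so it can feed the rest of the decoder unchanged while now carrying the long-range structural cues from the higher pyramid level.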
ISSN: 2096-0433, 2096-0662
DOI: 10.1007/s41095-022-0331-3