Image inpainting based on tensor ring decomposition with generative adversarial network
Published in: Signal, Image and Video Processing, 2024, Vol. 18 (11), pp. 7621-7634
Main authors: , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Image inpainting is a fundamental task in computer vision, but it faces three major challenges: (1) maintaining neighborhood texture consistency; (2) ensuring a plausible visual structure; and (3) the large parameter counts that limit existing inpainting models based on deep neural networks. To tackle these challenges, we propose a novel generative adversarial network based on tensor ring decomposition to inpaint images with varying degrees of damage. First, we design a dual-path block that captures features at different scales without significant memory consumption; every pair of dual-path blocks is incorporated into an enhanced residual module to integrate local and global features. We further propose a tensor ring layer that compresses the convolutions, reducing the model's parameter count and computational complexity. A more accurate U-Net-based discriminator then optimizes the network by minimizing reconstruction loss, adversarial loss, perceptual loss, and style loss. Extensive experiments demonstrate that, compared with other state-of-the-art algorithms, our model achieves superior compression, and the repaired images exhibit reasonable texture structure and contextual semantic information.
ISSN: 1863-1703 (print); 1863-1711 (electronic)
DOI: 10.1007/s11760-024-03415-7
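
The abstract names two concrete mechanisms: a tensor ring (TR) layer that compresses convolution kernels, and a composite training objective. Below is a minimal sketch of a TR convolution layer, assuming the 4-D kernel (C_out x C_in x k x k) is factorized into four 3-D cores contracted in a ring; the core layout, ranks, and initialization are illustrative assumptions, not the authors' exact design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TRConv2d(nn.Module):
    """Conv2d whose kernel is stored as four tensor-ring cores.

    Illustrative sketch: rank choices and core layout are assumptions,
    not the configuration reported in the paper.
    """
    def __init__(self, c_in, c_out, k=3, ranks=(4, 4, 4, 4), padding=1):
        super().__init__()
        r0, r1, r2, r3 = ranks
        self.padding = padding
        # One 3-D core per kernel mode; the shared ranks r0..r3 close the ring.
        self.g_out = nn.Parameter(torch.randn(r0, c_out, r1) * 0.1)
        self.g_in = nn.Parameter(torch.randn(r1, c_in, r2) * 0.1)
        self.g_h = nn.Parameter(torch.randn(r2, k, r3) * 0.1)
        self.g_w = nn.Parameter(torch.randn(r3, k, r0) * 0.1)

    def forward(self, x):
        # Contract the ring of cores back into a dense kernel:
        # W[o,i,h,w] = sum_{a,b,c,d} G1[a,o,b] G2[b,i,c] G3[c,h,d] G4[d,w,a]
        weight = torch.einsum('aob,bic,chd,dwa->oihw',
                              self.g_out, self.g_in, self.g_h, self.g_w)
        return F.conv2d(x, weight, padding=self.padding)

# Parameter count for a 256->256 3x3 conv:
# dense kernel: 256 * 256 * 3 * 3 = 589,824 parameters
# TR cores at ranks (4,4,4,4): 4*256*4 + 4*256*4 + 4*3*4 + 4*3*4 = 8,288
```

The generator objective combines the four losses named in the abstract; a plausible weighted sum is sketched below, where the weights are placeholders rather than the paper's reported values.

```python
def generator_loss(l_rec, l_adv, l_perc, l_style,
                   weights=(1.0, 0.1, 0.1, 250.0)):
    # Weighted sum of reconstruction, adversarial, perceptual, and style
    # losses; the weights here are illustrative assumptions.
    w_rec, w_adv, w_perc, w_style = weights
    return w_rec * l_rec + w_adv * l_adv + w_perc * l_perc + w_style * l_style
```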