Deep Generative Adversarial Network for Occlusion Removal from a Single Image
Saved in:
Main author: | , , |
---|---|
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | Nowadays, the enhanced capabilities of inexpensive imaging devices have led
to a tremendous increase in the acquisition and sharing of multimedia content
over the Internet. Despite advances in imaging sensor technology, adverse
conditions such as *occlusions* hamper photography and may degrade the
performance of applications such as surveillance, detection, and recognition.
Occlusion segmentation is difficult because of factors such as scale variations
and illumination changes. Similarly, recovering a scene from foreground
occlusions poses significant challenges due to the difficulty of accurately
estimating the occluded regions and maintaining coherence with the surrounding
context. In particular, image de-fencing presents its own set of challenges
because of the diverse variations in shape, texture, color, and pattern, and the
often cluttered environment. This study focuses on the automatic detection and
removal of occlusions from a single image. We propose a fully automatic,
two-stage convolutional neural network for fence segmentation and occlusion
completion. We leverage generative adversarial networks (GANs) to synthesize
realistic content, including both structure and texture, in a single shot for
inpainting. To assess zero-shot generalization, we evaluated our trained
occlusion detection model on our proposed fence-like occlusion segmentation
dataset. The dataset can be found on GitHub. |
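The two-stage pipeline described in the abstract (stage 1 segments the fence-like occlusion into a binary mask; stage 2 fills the masked region and composites it with the known pixels) can be sketched as follows. The actual stages in the paper are a segmentation CNN and a GAN generator; here both are replaced with trivial hypothetical stand-ins (intensity thresholding and mean fill) purely to illustrate the mask-and-composite structure:

```python
import numpy as np

def segment_fence(image, threshold=0.5):
    # Hypothetical stand-in for the stage-1 segmentation CNN:
    # threshold intensity to obtain a binary occlusion mask
    # (1 = occluded/fence pixel, 0 = background).
    return (image > threshold).astype(np.float32)

def inpaint(image, mask):
    # Hypothetical stand-in for the stage-2 GAN generator:
    # fill occluded pixels with the mean of the unoccluded background.
    background = image[mask == 0]
    fill = background.mean() if background.size else 0.0
    # Composite in one shot: keep known pixels, replace occluded ones.
    return image * (1 - mask) + fill * mask

# Toy grayscale "image": uniform background with a bright
# fence-like stripe down column 2.
img = np.full((4, 4), 0.2, dtype=np.float32)
img[:, 2] = 0.9

mask = segment_fence(img)      # stage 1: detect the occlusion
restored = inpaint(img, mask)  # stage 2: complete the scene
```

In the paper's formulation, `segment_fence` and `inpaint` would be learned networks, and the generator would synthesize structure and texture rather than a constant fill; the masking and compositing logic, however, follows the same pattern.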
DOI: | 10.48550/arxiv.2409.13242 |