Closing the Loop: Joint Rain Generation and Removal via Disentangled Image Translation
Format: Article
Language: English
Abstract: Existing deep learning-based image deraining methods have achieved promising performance on synthetic rainy images, as they typically rely on pairs of sharp images and their simulated rainy counterparts. However, these methods suffer a significant performance drop when facing real rain, because of the large gap between simplified synthetic rain and complex real rain. In this work, we argue that rain generation and removal are two sides of the same coin and should be tightly coupled. To close the loop, we propose to jointly learn the real rain generation and removal procedures within a unified disentangled image translation framework. Specifically, we propose a bidirectional disentangled translation network in which each unidirectional network contains two loops of joint rain generation and removal, for real and synthetic rainy images respectively. Meanwhile, we enforce the disentanglement strategy by decomposing the rainy image into a clean background and a rain layer (rain removal), in order to better preserve the identity of the background via both the cycle-consistency loss and the adversarial loss, and to ease the translation of the rain layer between real and synthetic rainy images. A counterpart composition with the entanglement strategy is applied symmetrically for rain generation. Extensive experiments on synthetic and real-world rain datasets show the superiority of the proposed method over the state of the art.
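The core disentanglement idea in the abstract — splitting a rainy image into a clean background plus a rain layer (removal), and recomposing them symmetrically (generation), with a cycle-consistency loss preserving the background — can be illustrated with a minimal sketch. This is not the authors' implementation: the additive composition model, the fixed rain layer standing in for a network prediction, and the L1 cycle loss are simplifying assumptions for illustration only.

```python
import numpy as np

def decompose(rainy, rain_layer):
    """Rain removal: recover the clean background by subtracting the
    rain layer (here a stand-in for a learned network prediction)."""
    return rainy - rain_layer

def compose(background, rain_layer):
    """Rain generation: entangle a clean background with a rain layer
    to synthesize a rainy image (the symmetric counterpart operation)."""
    return background + rain_layer

def cycle_consistency_loss(x, x_reconstructed):
    """L1 cycle-consistency loss, used to preserve the identity of the
    background across a removal -> generation loop."""
    return np.abs(x - x_reconstructed).mean()

# Toy example: one removal -> generation loop should reconstruct the
# input rainy image, driving the cycle loss toward zero.
rng = np.random.default_rng(0)
rainy = rng.random((8, 8))          # stand-in rainy image
rain = rng.random((8, 8)) * 0.1    # stand-in rain layer

background = decompose(rainy, rain)
reconstructed = compose(background, rain)
loss = cycle_consistency_loss(rainy, reconstructed)
```

In the paper this loop runs inside a bidirectional translation network with adversarial losses on both the background and the rain layer; the sketch only shows the additive decompose/compose cycle that those losses constrain.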
DOI: 10.48550/arxiv.2103.13660