Restorer: Removing Multi-Degradation with All-Axis Attention and Prompt Guidance
Main Authors: | , , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Order full text |
Abstract: | There are many excellent solutions in image restoration. However, most
methods require training separate models to restore images with different types
of degradation. Although existing all-in-one models effectively address multiple
types of degradation simultaneously, their performance in real-world scenarios
is still constrained by the task confusion problem. In this work, we attempt to
address this issue by introducing Restorer, a novel Transformer-based
all-in-one image restoration model. To effectively address the complex
degradation present in real-world images, we propose All-Axis Attention (AAA),
a mechanism that simultaneously models long-range dependencies across both the
spatial and channel dimensions, capturing potential correlations along all
axes. Additionally, we introduce textual prompts in Restorer to incorporate
explicit task priors, enabling the removal of specific degradation types based
on user instructions. By iterating over these prompts, Restorer can handle
composite degradation in real-world scenarios without requiring additional
training. Based on these designs, Restorer with one set of parameters achieves
state-of-the-art performance across multiple image restoration tasks compared
to existing all-in-one and even single-task models. Additionally, Restorer is
efficient at inference, suggesting its potential for real-world
applications. |
---|---|
DOI: | 10.48550/arxiv.2406.12587 |
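
The All-Axis Attention (AAA) described in the abstract, which attends jointly over the spatial and channel axes, could be sketched roughly as below. This is a minimal illustration under assumed shapes and a sum-based fusion of the two branches; the module name `AllAxisAttentionSketch` and all design details are hypothetical and not taken from the paper or its code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AllAxisAttentionSketch(nn.Module):
    """Illustrative sketch only: combines spatial self-attention (tokens = pixels)
    with channel self-attention (tokens = channel maps) and fuses them by summation."""

    def __init__(self, dim: int):
        super().__init__()
        self.qkv_spatial = nn.Conv2d(dim, dim * 3, kernel_size=1)
        self.qkv_channel = nn.Conv2d(dim, dim * 3, kernel_size=1)
        self.proj = nn.Conv2d(dim, dim, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape

        # Spatial branch: attention over the H*W axis (each token is a pixel).
        q, k, v = self.qkv_spatial(x).chunk(3, dim=1)
        q = q.flatten(2).transpose(1, 2)                      # (B, HW, C)
        k = k.flatten(2).transpose(1, 2)
        v = v.flatten(2).transpose(1, 2)
        spatial = F.scaled_dot_product_attention(q, k, v)     # (B, HW, C)
        spatial = spatial.transpose(1, 2).reshape(b, c, h, w)

        # Channel branch: attention over the C axis (each token is a channel map).
        q, k, v = self.qkv_channel(x).chunk(3, dim=1)
        q = F.normalize(q.flatten(2), dim=-1)                 # (B, C, HW)
        k = F.normalize(k.flatten(2), dim=-1)
        v = v.flatten(2)
        attn = torch.softmax(q @ k.transpose(1, 2), dim=-1)   # (B, C, C)
        channel = (attn @ v).reshape(b, c, h, w)

        # Fuse both axes and project back (sum fusion is an assumption).
        return self.proj(spatial + channel)


if __name__ == "__main__":
    layer = AllAxisAttentionSketch(dim=32)
    out = layer(torch.randn(1, 32, 64, 64))
    print(out.shape)  # torch.Size([1, 32, 64, 64])
```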
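
The abstract also describes handling composite degradation by iterating over textual prompts at inference time. A usage sketch of that idea might look like the following; the `restorer(image, prompt)` call signature and the example prompt wording are assumptions for illustration, not the released API.

```python
def iterative_restore(restorer, image, prompts):
    """Apply one prompt-conditioned restoration pass per degradation type,
    feeding each output into the next pass, as the abstract describes for
    composite degradation."""
    for prompt in prompts:
        image = restorer(image, prompt)
    return image


# Hypothetical usage for a rain-plus-haze photo; prompt wording is assumed.
# restored = iterative_restore(restorer, degraded_image, ["remove rain", "remove haze"])
```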