StyleStudio: Text-Driven Style Transfer with Selective Control of Style Elements
Saved in:
Main authors: , , , ,
Format: Article
Language: eng
Subjects:
Online access: Order full text
Abstract: Text-driven style transfer aims to merge the style of a reference image with
content described by a text prompt. Recent advancements in text-to-image models
have improved the nuance of style transformations, yet significant challenges
remain, particularly with overfitting to reference styles, limiting stylistic
control, and misaligning with textual content. In this paper, we propose three
complementary strategies to address these issues. First, we introduce a
cross-modal Adaptive Instance Normalization (AdaIN) mechanism for better
integration of style and text features, enhancing alignment. Second, we develop
a Style-based Classifier-Free Guidance (SCFG) approach that enables selective
control over stylistic elements, reducing irrelevant influences. Finally, we
incorporate a teacher model during early generation stages to stabilize spatial
layouts and mitigate artifacts. Our extensive evaluations demonstrate
significant improvements in style transfer quality and alignment with textual
prompts. Furthermore, our approach can be integrated into existing style
transfer frameworks without fine-tuning.
DOI: 10.48550/arxiv.2412.08503
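For orientation, the abstract's first two strategies build on two standard components: Adaptive Instance Normalization (AdaIN) and classifier-free guidance (CFG). The sketch below shows only their textbook forms; the paper's cross-modal AdaIN and Style-based Classifier-Free Guidance variants are not detailed in this record, so the function names, feature shapes, and guidance scale here are illustrative assumptions.

```python
import numpy as np

def adain(content_feat, style_feat, eps=1e-5):
    """Textbook AdaIN (Huang & Belongie, 2017): re-normalize the content
    features so their per-channel mean/std match those of the style
    features. Features are assumed to have shape (C, H, W)."""
    c_mean = content_feat.mean(axis=(1, 2), keepdims=True)
    c_std = content_feat.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True) + eps
    return s_std * (content_feat - c_mean) / c_std + s_mean

def cfg(eps_uncond, eps_cond, guidance_scale=7.5):
    """Standard classifier-free guidance: push the noise prediction from
    the unconditional estimate toward the conditional one."""
    return eps_uncond + guidance_scale * (eps_cond - eps_uncond)
```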