Research on style transfer for multiple regions



Bibliographic Details
Published in: Multimedia Tools and Applications, 2022-02, Vol. 81 (5), pp. 7183-7200
Authors: Yang, Wang; Zhenxin, Yu; Haiyan, Long
Format: Article
Language: English
Online access: Full text
Description
Abstract: The technique of creating unique visual experiences by composing the content of one image with the style of another has been extended to artwork, creative design, video processing, and other fields. Image style transfer technology is used to restyle images with colorful effects automatically. Most existing research focuses on transferring style to the whole image or to a single region within it, which is often inadequate for practical applications. In this work, we introduce an approach that applies different stylizations to different areas of an image. Motivated by the human visual attention mechanism, salient regions in the training dataset are labeled and used to train a semantic segmentation model. The structure of the neural style transfer model is simplified to improve runtime efficiency. In our approach, each local target region of the image is stylized evenly and carefully, and the regions are blended so that the result is more realistic and pleasing while the method runs faster. We perform separate experiments on the Cityscapes and Microsoft COCO 2017 datasets. Using accuracy and efficiency as performance metrics, comparisons with previously reported methods show improvements.
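
The pipeline the abstract describes (segment salient regions, stylize each region, blend the results) can be illustrated with a short sketch of the final compositing step. The following is a minimal illustration under stated assumptions, not the authors' implementation: the function name, the NumPy-only setup, and the roll-based feathering heuristic are all ours, and the per-region "stylizations" in the usage example are stand-in color shifts rather than real neural style transfer outputs.

import numpy as np

def composite_regions(content, stylized_by_region, masks, feather=2):
    # content: (H, W, 3) float image in [0, 1].
    # stylized_by_region: dict region_id -> (H, W, 3) stylized rendering.
    # masks: dict region_id -> (H, W) binary mask, 1 inside the region.
    # feather: number of smoothing passes applied to each mask so that
    #          adjacent regions transition softly instead of with hard seams.
    out = content.copy()
    blended = np.zeros_like(content)
    weight_total = np.zeros(content.shape[:2])
    for rid, stylized in stylized_by_region.items():
        w = masks[rid].astype(float)
        # Crude feathering: repeatedly average each mask pixel with its
        # four neighbors (np.roll wraps at the border, acceptable here).
        for _ in range(feather):
            w = 0.25 * (np.roll(w, 1, axis=0) + np.roll(w, -1, axis=0)
                        + np.roll(w, 1, axis=1) + np.roll(w, -1, axis=1))
        blended += w[..., None] * stylized
        weight_total += w
    covered = weight_total > 1e-6
    out[covered] = blended[covered] / weight_total[covered, None]
    return out

# Toy usage: split the image into two "regions" and fake their stylizations
# with flat color shifts standing in for real style-transfer outputs.
H, W = 64, 64
content = np.random.rand(H, W, 3)
masks = {0: np.zeros((H, W)), 1: np.zeros((H, W))}
masks[0][:, : W // 2] = 1
masks[1][:, W // 2:] = 1
stylized = {0: np.clip(content + 0.2, 0.0, 1.0),
            1: np.clip(content - 0.2, 0.0, 1.0)}
result = composite_regions(content, stylized, masks)
print(result.shape)  # (64, 64, 3)

Feathering each mask before blending is one simple way to avoid hard seams where differently stylized regions meet, which is the integration problem the abstract highlights.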
ISSN: 1380-7501 (print), 1573-7721 (electronic)
DOI: 10.1007/s11042-022-12121-w