Application of multi-level adaptive neural network based on optimization algorithm in image style transfer


Saved in:
Bibliographic Details
Published in: Multimedia Tools and Applications, 2024-02, Vol. 83 (29), p. 73127-73149
Main authors: Li, Hong-an, Wang, Lanye, Liu, Jun
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: Arbitrary image style transfer is the task of generating, from any pair of input images, an image rendered in a given artistic style. To address the problem of adapting to both global and local style while maintaining spatial consistency in arbitrary style transfer, this paper proposes a multi-level adaptive arbitrary style transfer network that integrates multi-level context information in a progressive manner. First, the Convolutional Block Attention Module (CBAM) is added to the encoder to improve the semantic matching of the algorithm and maintain spatial consistency. Second, multi-branch content features are integrated with the style features: the local similarity between content and style features is quantified in a non-local way, and the distribution of the style representation is rearranged according to the content representation. Finally, the aligned multi-layer features are passed to the decoder module through Adaptive Weight Skip Connections (AWSC), which integrate local and global styles efficiently and flexibly. In addition, an identity loss is used to eliminate image artifacts and better preserve content structure information. Qualitative and quantitative experiments show that the proposed method outperforms state-of-the-art CNN-based methods and can generate high-quality stylized images with arbitrary styles and better visual effects.
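The non-local alignment step described in the abstract resembles an attention mechanism: pairwise similarities between content and style features weight a rearrangement of the style representation. The following NumPy sketch illustrates that idea only; it is not the paper's implementation, and the normalization and flattened `(positions, channels)` layout are assumptions for the toy example.

```python
import numpy as np

def nonlocal_style_rearrange(content_feat, style_feat):
    """Rearrange style features according to content features.

    content_feat: (Nc, C) flattened content feature map (positions x channels)
    style_feat:   (Ns, C) flattened style feature map
    Returns a (Nc, C) map in which each content position receives a
    similarity-weighted mixture of style features (a non-local operation).
    """
    # Mean-variance normalize each map so similarity reflects feature
    # structure rather than absolute magnitude (an assumption of this sketch).
    def norm(x):
        return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-5)

    q = norm(content_feat)   # queries derived from content
    k = norm(style_feat)     # keys derived from style
    v = style_feat           # values: the raw style features

    sim = q @ k.T                                 # (Nc, Ns) similarity scores
    sim -= sim.max(axis=1, keepdims=True)         # numerical stability
    attn = np.exp(sim)
    attn /= attn.sum(axis=1, keepdims=True)       # softmax over style positions
    return attn @ v          # style distribution rearranged by content

# Toy example: 4 content positions, 6 style positions, 8 channels.
rng = np.random.default_rng(0)
out = nonlocal_style_rearrange(rng.normal(size=(4, 8)),
                               rng.normal(size=(6, 8)))
print(out.shape)  # (4, 8)
```

Because each output row is a convex combination of style feature rows, the result stays within the per-channel range of the style features, which is what lets such a module redistribute style statistics without inventing new ones.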
ISSN: 1573-7721
1380-7501
DOI:10.1007/s11042-024-18451-1