Neighboring Algorithm for Visual Semantic Analysis toward GAN-Generated Pictures



Bibliographic Details
Published in: Applied Bionics and Biomechanics, 2022-09, Vol. 2022, p. 1-6
Main authors: Zhang, Lu-Ming; Sheng, Yichuan
Format: Article
Language: English
Online access: Full text
Description
Abstract: Generative adversarial network (GAN)-guided visual quality evaluation means scoring GAN-generated images to quantify the degree of visual distortion. In general, there are very few quality-evaluation algorithms for GAN-generated images, and the performance of existing algorithms is limited. In this article, we propose a novel image ranking algorithm based on the nearest neighbor algorithm. It provides automatic, objective evaluation of GAN-generated images using an efficient evaluation technique. First, with the support of an artificial neural network, features of the various images are extracted to form a homogeneous image candidate pool, based on which the comparison range for the generated images is restricted. Subsequently, with the support of the K-nearest neighbors algorithm, the K images most similar to the generated image are retrieved from the candidate pool, and the image quality score is calculated accordingly. Finally, the quality of images produced by GAN models trained on a variety of classical datasets is evaluated. Comprehensive experimental results show that our algorithm substantially improves the efficiency and accuracy of the objective evaluation of GAN-generated images. The computational cost is only 1/9 to 1/28 that of the other methods. Meanwhile, the consistency between the objective evaluation of the GAN and human judgment has increased by more than 80%, in line with human visual perception.
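
The scoring step described in the abstract (retrieve the K images most similar to a generated image from a feature candidate pool and derive a quality score from their similarity) can be illustrated with a minimal sketch. This is not the authors' implementation; the feature extractor is assumed to run elsewhere, and the distance-to-score mapping, function names, and parameters below are illustrative assumptions only.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_quality_score(gen_feature, pool_features, k=5):
    # Fit a K-nearest-neighbors index on the candidate-pool features
    # (assumed to be produced by a pretrained neural network).
    nn = NearestNeighbors(n_neighbors=k).fit(pool_features)
    distances, _ = nn.kneighbors(gen_feature.reshape(1, -1))
    # A smaller mean distance to the K most similar pool images is taken to
    # indicate fewer visual distortions; map it to a score in (0, 1].
    return 1.0 / (1.0 + distances.mean())

# Usage with stand-in 512-D features (e.g., CNN embeddings):
rng = np.random.default_rng(0)
pool = rng.normal(size=(1000, 512))   # candidate pool of reference-image features
fake = rng.normal(size=512)           # feature vector of one GAN-generated image
print(knn_quality_score(fake, pool, k=5))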
ISSN: 1176-2322, 1754-2103
DOI: 10.1155/2022/2188152