Multi-Label Learning Based on Transfer Learning and Label Correlation

Bibliographic Details
Published in: Computers, Materials & Continua, 2019-01, Vol. 61 (1), pp. 155-169
Main authors: Yang, Kehua; She, Chaowei; Zhang, Wei; Yao, Jiqing; Long, Shaosong
Format: Article
Language: English
Online Access: Full text
Description
Abstract: In recent years, multi-label learning has received a lot of attention. However, most existing methods consider only global label correlation or only local label correlation. In fact, both global and local label correlations can appear in real-world situations at the same time; moreover, we should not restrict ourselves to pairwise labels while ignoring high-order label correlations. In this paper, we propose a novel and effective method called GLLCBN for multi-label learning. First, we obtain the global label correlation by exploiting label semantic similarity. Then, we analyze pairwise labels in the label space of the data set to acquire the local correlation. Next, we build the initial label dependency model from the global and local label correlations. After that, we use graph theory, probability theory and Bayesian networks to eliminate redundant dependency structures in the initial model, so as to obtain the optimal label dependency model. Finally, we obtain the feature extraction model by adjusting the Inception V3 convolutional neural network and combine it with the GLLCBN model to perform multi-label learning. The experimental results show that the proposed model outperforms other multi-label learning methods on the evaluation metrics.
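The abstract describes the GLLCBN pipeline only at a high level. The Python sketch below illustrates two of its ingredients under stated assumptions: estimating local pairwise label correlations from a binary label matrix via conditional co-occurrence probabilities, and extracting image features with a pre-trained Inception V3 backbone. The function names, the use of conditional probabilities as the correlation measure, and the stock Keras Inception V3 (rather than the authors' adjusted version) are illustrative assumptions, not the paper's implementation.

import numpy as np
import tensorflow as tf

def pairwise_conditional_probs(Y):
    """Estimate P(label_j = 1 | label_i = 1) for all ordered label pairs
    from a binary label matrix Y of shape (n_samples, n_labels)."""
    Y = Y.astype(float)
    counts = Y.sum(axis=0)              # how often each label occurs
    co_occurrence = Y.T @ Y             # joint occurrence counts
    return co_occurrence / np.maximum(counts[:, None], 1.0)

def build_feature_extractor():
    """Pre-trained Inception V3 without the classification head; global
    average pooling yields a 2048-dimensional feature vector per image."""
    return tf.keras.applications.InceptionV3(
        include_top=False, weights="imagenet", pooling="avg")

if __name__ == "__main__":
    # Toy label matrix: 4 samples, 3 labels.
    Y = np.array([[1, 0, 1],
                  [1, 1, 0],
                  [0, 1, 1],
                  [1, 1, 1]])
    print(pairwise_conditional_probs(Y))     # rough local pairwise correlations

    extractor = build_feature_extractor()
    images = np.random.rand(2, 299, 299, 3).astype("float32")
    print(extractor.predict(images).shape)   # (2, 2048) feature vectors

In the paper's terms, the pairwise statistics would seed the label dependency structure that is later pruned with the Bayesian network, and the pooled Inception V3 features would feed the multi-label classifier.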
ISSN: 1546-2226
1546-2218
DOI: 10.32604/cmc.2019.05901