Robust Multi-Label Learning with PRO Loss

Bibliographic Details
Published in: IEEE Transactions on Knowledge and Data Engineering, 2020-08, Vol. 32 (8), pp. 1610-1624
Authors: Xu, Miao; Li, Yu-Feng; Zhou, Zhi-Hua
Format: Article
Language: English
Description
Abstract: Multi-label learning methods assign multiple labels to one object. In practice, in addition to differentiating relevant labels from irrelevant ones, it is often desired to rank the relevant labels for an object, whereas the ranking of irrelevant labels is not important. Thus, we require an algorithm that performs classification and ranking of relevant labels simultaneously. Such a requirement, however, cannot be met by most existing methods, which were designed to optimize existing criteria; no existing criterion encodes the aforementioned requirement. In this paper, we present a new criterion, PRO Loss, concerning the prediction of all labels as well as the ranking of only the relevant labels. We then propose ProSVM, which optimizes PRO Loss efficiently using the alternating direction method of multipliers (ADMM). We further improve its efficiency with an upper approximation that reduces the number of constraints from O(T^2) to O(T), where T is the number of labels. We also note that in real applications it is difficult to obtain fully supervised information for multi-label data. To make the proposed algorithm more robust to incomplete supervision, we adapt ProSVM to the multi-label learning with partial labels problem. Experiments show that our proposal is not only superior on PRO Loss but also highly competitive on existing evaluation criteria.
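The abstract does not reproduce the paper's exact definition of PRO Loss, so the following Python sketch is only a rough illustration of the idea it describes: a pairwise loss that penalizes relevant/irrelevant pairs whose scores are misordered and relevant/relevant pairs whose predicted order disagrees with a given relevance ranking, while never comparing irrelevant labels with each other. The function name `pro_style_loss` and the `relevance_order` input format are hypothetical choices for this sketch, not the paper's notation.

```python
import numpy as np

def pro_style_loss(scores, relevant, relevance_order):
    """Illustrative pairwise surrogate in the spirit of PRO Loss (assumption,
    not the paper's exact definition).

    scores          : (T,) array of predicted scores, one per label
    relevant        : (T,) boolean array, True for relevant labels
    relevance_order : dict mapping each relevant label index to its rank
                      (smaller rank = more relevant)
    Returns the fraction of misordered label pairs among the pairs counted.
    """
    rel = np.flatnonzero(relevant)
    irr = np.flatnonzero(~relevant)

    mistakes, pairs = 0, 0
    # (i) every relevant label should score higher than every irrelevant one
    for i in rel:
        for j in irr:
            pairs += 1
            mistakes += scores[i] <= scores[j]
    # (ii) among relevant labels, the predicted order should follow the
    #      given relevance ranking; irrelevant labels are never compared
    for a in rel:
        for b in rel:
            if relevance_order[a] < relevance_order[b]:
                pairs += 1
                mistakes += scores[a] <= scores[b]
    return mistakes / pairs if pairs else 0.0

# Toy usage: 4 labels, labels 0 and 2 relevant, with label 0 ranked above label 2.
scores = np.array([2.1, -0.3, 1.2, 0.4])
relevant = np.array([True, False, True, False])
print(pro_style_loss(scores, relevant, {0: 1, 2: 2}))  # 0.0: all counted pairs ordered correctly
```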
ISSN: 1041-4347
EISSN: 1558-2191
DOI: 10.1109/TKDE.2019.2908898