Multi-View Multi-Instance Learning Based on Joint Sparse Representation and Multi-View Dictionary Learning

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017-12, Vol. 39 (12), pp. 2554-2560
Authors: Li, Bing; Yuan, Chunfeng; Xiong, Weihua; Hu, Weiming; Peng, Houwen; Ding, Xinmiao; Maybank, Steve
Format: Article
Language: English
Description
Abstract: In multi-instance learning (MIL), the relations among instances in a bag convey important contextual information in many applications. Previous studies on MIL either ignore such relations or simply model them with a fixed graph structure, so the overall performance inevitably degrades in complex environments. To address this problem, this paper proposes a novel multi-view multi-instance learning algorithm (M²IL) that combines multiple context structures in a bag into a unified framework. The novel aspects are: (i) we propose a sparse ε-graph model that can generate different graphs with different parameters to represent various context relations in a bag; (ii) we propose a multi-view joint sparse representation that integrates these graphs into a unified framework for bag classification; and (iii) we propose a multi-view dictionary learning algorithm to obtain a multi-view graph dictionary that considers cues from all views simultaneously to improve the discrimination of M²IL. Experiments and analyses in many practical applications prove the effectiveness of M²IL.
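
To make the second contribution concrete, the following NumPy sketch illustrates one common way a joint sparse representation over several views can be realized: each view of a bag is coded over its own dictionary, an ℓ2,1 (row-group) penalty couples the views so that they select the same atoms, and the bag is assigned to the class whose dictionaries reconstruct it best. The function names (joint_sparse_code, classify_bag), the ISTA solver, the penalty weight lam, and the residual-based decision rule are illustrative assumptions for exposition, not the authors' exact M²IL formulation or its dictionary learning step.

    import numpy as np


    def joint_sparse_code(dicts, feats, lam=0.1, n_iter=300):
        """Jointly code one bag's V view features over V view dictionaries.

        Solves  min_A  sum_v ||y_v - D_v a_v||_2^2 + lam * ||A||_{2,1}
        with proximal gradient descent (ISTA); the row-wise l2,1 penalty
        couples the views so that they share a common sparsity pattern.
        """
        n_atoms = dicts[0].shape[1]          # every view dictionary shares the atom count
        A = np.zeros((n_atoms, len(dicts)))  # column v holds the code for view v
        # a safe step size: 1 / (2 * max_v ||D_v||_2^2) bounds the gradient's Lipschitz constant
        step = 1.0 / (2.0 * max(np.linalg.norm(D, 2) ** 2 for D in dicts))
        for _ in range(n_iter):
            # gradient of the smooth data-fit term, one column per view
            grad = np.column_stack([2.0 * D.T @ (D @ A[:, v] - y)
                                    for v, (D, y) in enumerate(zip(dicts, feats))])
            B = A - step * grad
            # proximal operator of lam*||.||_{2,1}: group soft-threshold each row
            row_norms = np.maximum(np.linalg.norm(B, axis=1, keepdims=True), 1e-12)
            A = np.maximum(0.0, 1.0 - step * lam / row_norms) * B
        return A


    def classify_bag(class_dicts, feats, lam=0.1):
        """Assign the bag to the class whose view dictionaries reconstruct it best."""
        residuals = []
        for dicts in class_dicts:            # one list of per-view dictionaries per class
            A = joint_sparse_code(dicts, feats, lam)
            residuals.append(sum(np.linalg.norm(y - D @ A[:, v]) ** 2
                                 for v, (D, y) in enumerate(zip(dicts, feats))))
        return int(np.argmin(residuals))


    if __name__ == "__main__":
        # toy usage: 2 classes, 3 views per bag, 8 atoms per dictionary, 5-dim view features
        rng = np.random.default_rng(0)
        class_dicts = [[rng.standard_normal((5, 8)) for _ in range(3)] for _ in range(2)]
        bag_feats = [rng.standard_normal(5) for _ in range(3)]
        print("predicted class:", classify_bag(class_dicts, bag_feats))

In the paper, the views would correspond to the sparse ε-graph representations built with different parameters, and the per-view dictionaries would come from the proposed multi-view dictionary learning rather than being fixed or random as in this toy example.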
ISSN: 0162-8828, 1939-3539, 2160-9292
DOI: 10.1109/TPAMI.2017.2669303