Multiview Variational Sparse Gaussian Processes

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2021-07, Vol. 32 (7), pp. 2875-2885
Authors: Mao, Liang; Sun, Shiliang
Format: Article
Language: English
Description
Abstract: Gaussian process (GP) models are flexible nonparametric models widely used in a variety of tasks. Variational sparse GP (VSGP) scales GP models to large data sets by summarizing the posterior process with a set of inducing points. In this article, we extend VSGP to handle multiview data. We model each view with a VSGP and augment it with an additional set of inducing points. These VSGPs are coupled together by enforcing the means of their posteriors to agree at the locations of these inducing points. To learn these shared inducing points, we introduce an additional GP model that is defined in the concatenated feature space. Experiments on real-world data sets show that our multiview VSGP (MVSGP) model outperforms single-view VSGP consistently and is superior to state-of-the-art kernel-based multiview baselines for classification tasks.
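The abstract only sketches the coupling mechanism at a high level. The toy script below (plain NumPy; the function names, kernel, two-view dimensions, and the squared-error disagreement penalty are illustrative assumptions, not the paper's actual objective) shows the basic idea: each view has its own sparse-GP posterior mean, and the two views are tied together by measuring how much those means disagree at a set of shared inducing locations.

# Hypothetical sketch of the coupling idea described in the abstract.
# All names and the squared penalty are assumptions for illustration only.
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between two sets of inputs.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def sparse_posterior_mean(X_query, Z, m_u, lengthscale=1.0, variance=1.0, jitter=1e-6):
    # Posterior mean of a sparse GP at X_query, given inducing inputs Z and
    # variational mean m_u of the inducing outputs: K_xz K_zz^{-1} m_u.
    K_zz = rbf(Z, Z, lengthscale, variance) + jitter * np.eye(len(Z))
    K_xz = rbf(X_query, Z, lengthscale, variance)
    return K_xz @ np.linalg.solve(K_zz, m_u)

rng = np.random.default_rng(0)

# Each view has its own inducing inputs and variational means (toy values).
Z1, m1 = rng.normal(size=(8, 2)), rng.normal(size=(8, 1))   # view 1, 2-D features
Z2, m2 = rng.normal(size=(8, 3)), rng.normal(size=(8, 1))   # view 2, 3-D features

# Shared inducing points, represented in each view's own feature space
# (an assumed layout for this illustration).
S1 = rng.normal(size=(5, 2))
S2 = rng.normal(size=(5, 3))

# Posterior means of each view's sparse GP evaluated at the shared points.
mean1_at_shared = sparse_posterior_mean(S1, Z1, m1)
mean2_at_shared = sparse_posterior_mean(S2, Z2, m2)

# Coupling term: penalize disagreement between the two posterior means at
# the shared inducing locations (simple squared penalty, for illustration).
coupling_penalty = np.mean((mean1_at_shared - mean2_at_shared) ** 2)
print("disagreement at shared inducing points:", float(coupling_penalty))

In the paper's setting, the agreement between views would presumably enter the variational objective jointly with each view's evidence lower bound rather than as a standalone penalty as sketched here.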
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2020.3008496