Kernelized support tensor train machines
| Published in: | Pattern Recognition, 2022-02, Vol. 122, p. 108337, Article 108337 |
|---|---|
| Main authors: | , , , |
| Format: | Article |
| Language: | eng |
| Subjects: | |
| Online access: | Full text |
Abstract:

•The tensorial data structure is useful, and it is better to keep this structure than to vectorize the data.
•The support vector machine is extended to a kernelized support tensor train machine, which accepts tensorial input directly.
•A tensor train based kernel mapping scheme is proposed, and the validity of the kernel mapping is proved.
•A data decomposition scheme ensures that similar tensors have similar kernel mappings in the feature space.
•Different kernel functions can be applied to different tensorial data modes.
Tensor, a multi-dimensional data structure, has been exploited recently in the machine learning community. Traditional machine learning approaches are vector- or matrix-based and cannot handle tensorial data directly. In this paper, we propose a tensor train (TT)-based kernel technique for the first time and apply it to the conventional support vector machine (SVM) for high-dimensional image classification with a very small number of training samples. Specifically, we propose a kernelized support tensor train machine that accepts tensorial input and preserves the intrinsic kernel property. The main contributions are threefold. First, we propose a TT-based feature mapping procedure that maintains the TT structure in the feature space. Second, we demonstrate two ways to construct the TT-based kernel function while considering consistency with the TT inner product and preservation of information. Third, we show that it is possible to apply different kernel functions on different data modes. In principle, our method tensorizes the standard SVM in both its input structure and its kernel mapping scheme, which reduces the storage and computation complexity of kernel matrix construction from exponential to polynomial. The validity proofs and computational complexity of the proposed TT-based kernel functions are given in detail. Extensive experiments on high-dimensional fMRI and color image datasets demonstrate the superiority of the proposed scheme over state-of-the-art techniques.
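As a rough illustration of the pipeline the abstract describes, the sketch below decomposes each sample into TT cores with a plain TT-SVD, builds a Gram matrix from a product of mode-wise RBF kernels evaluated on corresponding cores, and passes it to a precomputed-kernel SVM. This is not the authors' implementation: the per-core RBF kernel, the fixed rank cap, and the helper names (`tt_svd`, `tt_rbf_kernel`, `gram_matrix`) are illustrative assumptions, and the paper's two kernel constructions and mode-specific kernels are only approximated here.

```python
# Minimal sketch under the assumptions stated above, not the paper's implementation:
# TT-SVD -> product of mode-wise RBF kernels on TT cores -> precomputed-kernel SVM.
import numpy as np
from sklearn.svm import SVC

def tt_svd(tensor, max_rank=3):
    """Standard TT-SVD: successive truncated SVDs yield one 3-way core per mode."""
    dims = tensor.shape
    cores, r_prev, mat = [], 1, tensor
    for k in range(len(dims) - 1):
        mat = mat.reshape(r_prev * dims[k], -1)
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, S.size)
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        mat = S[:r, None] * Vt[:r]   # carry the remainder on to the next mode
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores

def tt_rbf_kernel(cores_a, cores_b, gamma=1e-3):
    """Product over modes of an RBF kernel between corresponding TT cores
    (one plausible instantiation of a TT-based product kernel)."""
    value = 1.0
    for Ga, Gb in zip(cores_a, cores_b):
        value *= np.exp(-gamma * np.sum((Ga - Gb) ** 2))
    return value

def gram_matrix(tts_a, tts_b, gamma=1e-3):
    """Pairwise kernel matrix between two lists of TT-decomposed samples."""
    return np.array([[tt_rbf_kernel(a, b, gamma) for b in tts_b] for a in tts_a])

# Toy usage: small random 4-way "images" from two shifted classes.
rng = np.random.default_rng(0)
X = [rng.normal(size=(8, 8, 3, 4)) + 0.5 * (i % 2) for i in range(20)]
y = np.array([i % 2 for i in range(20)])
# Equal tensor shapes and a fixed rank cap give identical core shapes across samples.
tts = [tt_svd(x) for x in X]

K = gram_matrix(tts, tts)
clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))
```

Because the kernel factorizes over the TT cores, each Gram entry is computed from the small cores rather than the full (exponentially large) vectorized tensors, which is the complexity reduction the abstract refers to.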
ISSN: | 0031-3203, 1873-5142 |
DOI: | 10.1016/j.patcog.2021.108337 |