On Non-negative Matrix Factorization Using Gaussian Kernels as Covariates
| Published in: | Ouyou toukeigaku, 2023, Vol. 52(2), pp. 59-74 |
|---|---|
| Author: | |
| Format: | Article |
| Language: | English; Japanese |
| Online access: | Full text |
| Abstract: | The observation matrix is approximated by a non-negative matrix factorization as the product of a base matrix and a coefficient matrix. Since the coefficient vector differs across individuals, the coefficient matrix is represented as the product of a parameter matrix and a covariate matrix. In general, using a covariate matrix reduces the accuracy of the approximation, but in this paper the use of a Gaussian kernel for the covariates suppresses this reduction. Gaussian kernels yield coefficients and predictions that vary smoothly with the covariates, making them easy to interpret. The usefulness of the proposed method is demonstrated by applying it to text data and longitudinal measures and comparing it with the case where covariates are not used. |
| ISSN: | 0285-0370; 1883-8081 |
| DOI: | 10.5023/jappstat.52.59 |
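
The abstract describes a factorization of the form X ≈ W(ΘK), where the coefficient matrix is itself the product of a parameter matrix Θ and a Gaussian-kernel matrix K built from the covariates. The following is a minimal sketch of that structure, not the authors' implementation: the function names, the multiplicative Frobenius-loss updates, and the hyperparameters `rank` and `sigma` are illustrative assumptions rather than details taken from the paper.

```python
# Sketch of NMF with a Gaussian kernel of covariates (assumed setup, not the paper's code):
#   X (p x n)  ~=  W (p x r) @ H (r x n),   with  H = Theta (r x n) @ K (n x n),
#   K[i, j] = exp(-||z_i - z_j||^2 / (2 * sigma^2))  from covariate vectors z_i.
import numpy as np

def gaussian_kernel(Z, sigma=1.0):
    """Gaussian kernel matrix over the rows of Z (one covariate vector per individual)."""
    sq = np.sum(Z**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))

def nmf_kernel_covariates(X, Z, rank=5, sigma=1.0, n_iter=500, eps=1e-9):
    """Factor X ~= W @ (Theta @ K) with W, Theta >= 0; K is fixed by the covariates Z."""
    rng = np.random.default_rng(0)
    p, n = X.shape
    K = gaussian_kernel(Z, sigma)        # nonnegative by construction
    W = rng.random((p, rank))
    Theta = rng.random((rank, n))
    for _ in range(n_iter):
        H = Theta @ K                    # coefficient matrix induced by the covariates
        # Standard multiplicative update for W under the Frobenius loss.
        W *= (X @ H.T) / (W @ H @ H.T + eps)
        # Update for Theta; K enters through the chain rule of d||X - W Theta K||^2 / dTheta.
        Theta *= (W.T @ X @ K.T) / (W.T @ W @ Theta @ K @ K.T + eps)
    return W, Theta, K

# Illustrative usage with random nonnegative data and 3-dimensional covariates:
# X = np.abs(np.random.default_rng(1).random((20, 30)))
# Z = np.random.default_rng(2).random((30, 3))
# W, Theta, K = nmf_kernel_covariates(X, Z, rank=4, sigma=0.5)
```

Because K is nonnegative and the updates are multiplicative, W and Theta stay nonnegative throughout, and the induced coefficients Theta @ K change smoothly as the covariates change, which is the interpretability property the abstract emphasizes.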