Sequential Recommendation with Relation-Aware Kernelized Self-Attention
Format: Article
Language: English
Online Access: Order full text
Abstract: AAAI 2020. Recent studies have identified that sequential recommendation is improved by the attention mechanism. Following this development, we propose Relation-Aware Kernelized Self-Attention (RKSA), which adopts the self-attention mechanism of the Transformer augmented with a probabilistic model. The original self-attention of the Transformer is a deterministic measure without relation awareness. Therefore, we introduce a latent space into the self-attention, and the latent space models the recommendation context from relations as a multivariate skew-normal distribution with a kernelized covariance matrix built from co-occurrences, item characteristics, and user information. This work merges the self-attention of the Transformer and sequential recommendation by adding a probabilistic model of the recommendation task's specifics. We experimented with RKSA on benchmark datasets, and RKSA shows significant improvements over recent baseline models. Also, RKSA was able to produce a latent space model that explains the reasons for a recommendation.
DOI: 10.48550/arxiv.1911.06478
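The abstract describes attention logits that are made relation-aware through a kernelized covariance over the items in a sequence. The following is a minimal sketch of that general idea; it is not the authors' RKSA implementation, and the RBF kernel, the Gaussian (rather than skew-normal) latent, and all function names and shapes are illustrative assumptions.

```python
# Minimal sketch of relation-aware, kernelized attention (illustrative only;
# not the authors' RKSA implementation).
import numpy as np

def rbf_kernel(features, length_scale=1.0):
    """RBF kernel over per-item relation features -> (n, n) covariance matrix."""
    sq_dists = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * length_scale ** 2))

def relation_aware_attention(queries, keys, values, relation_features,
                             noise_scale=0.1, seed=0):
    """Scaled dot-product attention whose logits are perturbed by correlated
    Gaussian noise drawn from a kernelized covariance over sequence items
    (a simplified stand-in for a skew-normal latent)."""
    rng = np.random.default_rng(seed)
    d = queries.shape[-1]
    logits = queries @ keys.T / np.sqrt(d)                # (seq_len, seq_len)
    cov = rbf_kernel(relation_features)                   # relation-derived covariance
    noise = rng.multivariate_normal(np.zeros(len(cov)), cov, size=len(cov))
    logits = logits + noise_scale * noise                 # inject relation-aware latent
    weights = np.exp(logits - logits.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)             # row-wise softmax
    return weights @ values

# Toy usage: 5 items, 8-dim hidden states, 3-dim relation features.
rng = np.random.default_rng(42)
h = rng.normal(size=(5, 8))
rel = rng.normal(size=(5, 3))
print(relation_aware_attention(h, h, h, rel).shape)       # (5, 8)
```

In the paper the latent is a multivariate skew-normal and its covariance kernel combines co-occurrence, item, and user signals; the sketch above collapses these into a single feature-based RBF kernel purely for brevity.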