Graph convolutional network and self-attentive for sequential recommendation


Bibliographic Details
Published in: PeerJ Computer Science, 2023-12, Vol. 9, p. e1701, Article e1701
Main authors: Guo, Kaifeng; Zeng, Guolei
Format: Article
Language: English
Online access: Full text
Description
Abstract: Sequential recommender systems (SRS) aim to provide personalized recommendations to users in the context of large-scale datasets and complex user behavior sequences. However, the effectiveness of most existing embedding techniques in capturing the intricate relationships between items remains suboptimal, with a significant concentration of item embedding vectors that hinders improvement of final prediction performance. Our study reveals that the distribution of item embeddings can be effectively dispersed through graph interaction networks and contrastive learning. In this article, we propose a graph convolutional neural network to capture the complex relationships between users and items, leveraging the learned embedding vectors of nodes to represent items. Additionally, we employ a self-attentive sequential model to predict outcomes based on the item embedding sequences of individual users. Furthermore, we incorporate instance-wise contrastive learning (ICL) and prototype contrastive learning (PCL) during the training process to enhance the effectiveness of representation learning. Broad comparative experiments and ablation studies were conducted across four distinct datasets. The experimental outcomes clearly demonstrate the superior performance of our proposed GSASRec model.
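The pipeline the abstract describes — graph convolution over the user-item interaction graph to learn item embeddings, followed by a self-attentive sequential model that scores the next item — can be illustrated with a minimal NumPy sketch. This is not the authors' GSASRec implementation: the matrix sizes are toy values, the symmetric-normalized propagation is a common LightGCN-style simplification assumed here, the `self_attention` function drops the learned projections of a full SASRec block, and the ICL/PCL contrastive losses are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (illustrative assumptions, not the paper's settings)
n_users, n_items, d = 4, 6, 8

# Bipartite user-item interaction matrix R (1 = user interacted with item)
R = (rng.random((n_users, n_items)) < 0.4).astype(float)

# --- Graph convolution over the bipartite graph (LightGCN-style) ---
# Full adjacency of the user+item graph, then symmetric normalization
# A_hat = D^{-1/2} A D^{-1/2}
A = np.block([[np.zeros((n_users, n_users)), R],
              [R.T, np.zeros((n_items, n_items))]])
deg = A.sum(axis=1)
d_inv_sqrt = np.zeros_like(deg)
nz = deg > 0
d_inv_sqrt[nz] = deg[nz] ** -0.5
A_hat = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

E0 = rng.normal(size=(n_users + n_items, d))  # initial node embeddings
E1 = A_hat @ E0                               # one propagation layer
item_emb = E1[n_users:]                       # propagated item embeddings

# --- Self-attentive sequence model over one user's item sequence ---
def self_attention(X):
    """Single-head scaled dot-product self-attention with a causal mask
    (position t attends only to positions <= t); learned Q/K/V
    projections are omitted for brevity."""
    scores = X @ X.T / np.sqrt(X.shape[1])
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[mask] = -np.inf
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ X

seq = item_emb[[0, 2, 3]]        # a toy interaction sequence of item ids
h = self_attention(seq)[-1]      # representation of the last position

# Next-item scores: inner product of h with every item embedding
next_scores = item_emb @ h
pred = int(np.argmax(next_scores))
```

In the full model the propagated item embeddings would feed a trained transformer block, and the contrastive objectives (ICL/PCL) would additionally spread the embedding distribution during training; the sketch only shows the forward scoring path.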
ISSN: 2376-5992
DOI: 10.7717/peerj-cs.1701