SR-DSGA: Session Recommendation for Dual Sequence Based on Graph Neural Network and Multi-Attention
Saved in:
Published in: | IEEE Access, 2024, Vol. 12, pp. 109380-109387 |
---|---|
Main authors: | , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | A session recommender system (SRS) captures a user's sequential features from historical behavior to predict the next clicked item, so the accuracy of the extracted session features directly determines SRS performance. Existing session recommendation methods have two flaws: 1) they ignore the complex connections between items, representing each item in a relatively isolated manner; and 2) they neglect the transition patterns between item attributes. To address these issues, we propose a novel session recommendation model named SR-DSGA (Session Recommendation for Dual Sequence based on Graph neural network and multi-attention). First, SR-DSGA adopts the message-passing mechanism of a graph neural network to obtain non-isolated item embeddings that carry specific semantic relationships, via item-level explicit sequence modeling. Second, SR-DSGA exploits the Transformer's multi-head self-attention mechanism to obtain complementary item embeddings indirectly, via attribute-level implicit sequence modeling. SR-DSGA can therefore extract fine-grained features with full sequential patterns even in sparse-data scenarios. Finally, soft attention and a time threshold are used to capture the user's long-term and short-term preferences, respectively. Experimental studies on real-world datasets demonstrate that the proposed SR-DSGA model outperforms state-of-the-art benchmark methods. |
---|---|
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2024.3440351 |
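
The abstract describes a dual-sequence pipeline: GNN message passing over the item-level session graph, multi-head self-attention over the attribute-level sequence, and soft attention plus a time threshold for long- and short-term preferences. The following is a minimal PyTorch sketch of that pipeline as we read it from the abstract; all module names, dimensions, and the fusion and readout choices are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the dual-sequence idea from the SR-DSGA abstract.
# Names, dimensions, and the fusion scheme are assumptions for illustration.
import torch
import torch.nn as nn


class DualSequenceEncoder(nn.Module):
    def __init__(self, n_items, n_attrs, dim=64, heads=4):
        super().__init__()
        self.item_emb = nn.Embedding(n_items, dim)
        self.attr_emb = nn.Embedding(n_attrs, dim)
        # One round of graph message passing: neighbors aggregated through
        # a learned linear map (a simplified GNN layer).
        self.msg = nn.Linear(dim, dim)
        # Transformer-style multi-head self-attention over the attribute sequence.
        self.attr_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Soft attention that scores each position for the long-term preference.
        self.score = nn.Linear(dim, 1)

    def forward(self, items, adj, attrs, short_mask):
        # items:      (B, L) item ids of the session sequence
        # adj:        (B, L, L) normalized adjacency of the session graph
        # attrs:      (B, L) attribute ids aligned with the items
        # short_mask: (B, L) 1.0 for clicks after the time threshold, else 0.0
        h = self.item_emb(items)                        # (B, L, D)
        h = h + torch.bmm(adj, self.msg(h))             # explicit item-level message passing
        a = self.attr_emb(attrs)
        a, _ = self.attr_attn(a, a, a)                  # implicit attribute-level modeling
        h = h + a                                       # fuse the two sequence views
        w = torch.softmax(self.score(h).squeeze(-1), -1)
        long_pref = (w.unsqueeze(-1) * h).sum(1)        # soft-attention long-term preference
        denom = short_mask.sum(1, keepdim=True).clamp(min=1.0)
        short_pref = (short_mask.unsqueeze(-1) * h).sum(1) / denom  # recent clicks only
        session = long_pref + short_pref
        # Score all candidate items by dot product with their embeddings.
        return session @ self.item_emb.weight.T         # (B, n_items)
```

In this sketch the two embedding views are fused by simple addition and the session representation by summing the two preferences; the paper may well use concatenation, gating, or a learned combination instead.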