Re-HGNM: a repeat aware hypergraph neural machine for session-based recommendation
Published in: Neural Computing & Applications, 2024-06, Vol. 36 (17), p. 9661-9674
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Hypergraph neural networks (HGNNs) for session-based recommendation (SBR) are still rare, but they have shown promising performance. However, under the hypergraph framework, no prior work has emphasized the importance of repeat consumption, which is an important consumption pattern in SBR (e.g., people watch movies or listen to music repeatedly over time), and neglecting it limits SBR performance in ways that go beyond graph topology. To fill this gap and improve recommendation performance, this paper incorporates repeat consumption into HGNNs and proposes a new recommendation machine, named Re-HGNM, which leverages the expressiveness of HGNNs in representation learning and accounts for repeat consumption by modeling the relationships between repeat behaviors and by making repeat recommendations explicitly. Specifically, in the encoder of Re-HGNM, sessions are processed as hyperedges, in which repeated behaviors are merged, to constitute a hypergraph that conveys not only the complex relations between unique items but also the relationships between repeat behaviors. After learning item representations via hypergraph convolution, a repeat-explore module determines whether to make a repeat recommendation. In repeat mode, the next click must be among previously clicked items; the shrunken prediction space therefore makes it easier to find the target item. Accordingly, in the decoding stage, two decoders work under the two separate modes. Finally, compared with the strongest baseline, Re-HGNM relatively improves P@20 by 58.37%, 8.70%, 1.28%, and 1.16% and MRR@20 by 25.52%, 10.31%, 10.76%, and 2.76% on the Tmall, RetailRocket, Nowplaying, and Diginetica datasets, respectively, making it the current state-of-the-art model for SBR.
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-023-08985-0
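As a rough illustration of the repeat/explore decoding described in the abstract, the sketch below shows how a learned mode gate could mix a repeat decoder, whose prediction space is restricted to items already in the session, with an explore decoder over the full catalogue. The module name, layer shapes, and dot-product scoring are illustrative assumptions for this sketch, not the authors' actual Re-HGNM implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RepeatExploreDecoder(nn.Module):
    """Illustrative two-mode decoder: a gate chooses between a 'repeat'
    distribution restricted to items already in the session and an
    'explore' distribution over the full item catalogue."""

    def __init__(self, hidden_dim: int, num_items: int):
        super().__init__()
        self.mode_gate = nn.Linear(hidden_dim, 2)           # logits for (repeat, explore)
        self.item_emb = nn.Embedding(num_items, hidden_dim)

    def forward(self, session_repr: torch.Tensor, session_items: torch.Tensor):
        # session_repr: (batch, hidden_dim) session embedding, e.g. produced by a
        # hypergraph-convolution encoder; session_items: (batch, max_len) item ids
        # (padding ids, if any, should be masked out in practice).
        mode_prob = F.softmax(self.mode_gate(session_repr), dim=-1)   # (batch, 2)

        # Explore mode: score every item in the catalogue.
        all_scores = session_repr @ self.item_emb.weight.t()          # (batch, num_items)
        explore_dist = F.softmax(all_scores, dim=-1)

        # Repeat mode: mask scores so only items already clicked in the
        # session can be predicted (the shrunken prediction space).
        repeat_mask = torch.zeros_like(all_scores).scatter_(1, session_items, 1.0)
        repeat_scores = all_scores.masked_fill(repeat_mask == 0, float("-inf"))
        repeat_dist = F.softmax(repeat_scores, dim=-1)

        # Mix the two decoders with the predicted mode probabilities.
        return mode_prob[:, :1] * repeat_dist + mode_prob[:, 1:] * explore_dist


# Toy usage: a batch of 2 sessions over a catalogue of 10 items.
decoder = RepeatExploreDecoder(hidden_dim=16, num_items=10)
probs = decoder(torch.randn(2, 16), torch.tensor([[1, 3, 3], [2, 5, 7]]))
print(probs.shape)  # torch.Size([2, 10]); each row sums to 1
```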