Neural attention model for recommendation based on factorization machines
Published in: Applied Intelligence (Dordrecht, Netherlands), 2021-04, Vol. 51 (4), p. 1829-1844
Main authors: , , , ,
Format: Article
Language: eng
Subjects:
Online access: Full text
Abstract: In recommendation systems, it is of vital importance to comprehensively consider various aspects of information to make accurate recommendations for users. When the low-order feature interactions between items are insufficient, it is necessary to mine information to learn higher-order feature interactions. In addition, to distinguish the different importance levels of feature interactions, larger weights should be assigned to features with larger contributions to predictions, and smaller weights to those with smaller contributions. Therefore, this paper proposes a neural attention model for recommendation (NAM), which deepens factorization machines (FMs) by adding an attention mechanism and fully connected layers. Through the attention mechanism, NAM can learn the different importance levels of low-order feature interactions. By adding fully connected layers on top of the attention component, NAM can model high-order feature interactions in a nonlinear way. Experiments on two real-world datasets demonstrate that NAM has excellent performance and is superior to FM and other state-of-the-art models. The results demonstrate the effectiveness of the proposed model and the potential of using neural networks for prediction under sparse data.
ISSN: 0924-669X, 1573-7497
DOI: 10.1007/s10489-020-01921-y
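
The abstract outlines the model's structure: FM-style pairwise feature interactions, an attention network that weights those interactions by their importance, and fully connected layers on top that capture higher-order interactions nonlinearly. Below is a minimal PyTorch sketch of that idea; the class name NAMSketch, the layer sizes, and the exact form of the attention network are illustrative assumptions, not the authors' published implementation.

```python
# Illustrative sketch of the NAM idea from the abstract (not the paper's code):
# FM pairwise interactions -> attention weighting -> fully connected layers.
import torch
import torch.nn as nn

class NAMSketch(nn.Module):
    def __init__(self, num_features, embed_dim=16, attn_dim=16, hidden_dims=(64, 32)):
        super().__init__()
        self.embedding = nn.Embedding(num_features, embed_dim)  # feature embeddings
        self.linear = nn.Embedding(num_features, 1)              # first-order (linear) terms
        self.bias = nn.Parameter(torch.zeros(1))
        # Attention network scoring each pairwise interaction (assumed MLP form)
        self.attn = nn.Sequential(
            nn.Linear(embed_dim, attn_dim), nn.ReLU(), nn.Linear(attn_dim, 1)
        )
        # Fully connected layers over the attention-pooled interaction vector
        layers, in_dim = [], embed_dim
        for h in hidden_dims:
            layers += [nn.Linear(in_dim, h), nn.ReLU()]
            in_dim = h
        layers.append(nn.Linear(in_dim, 1))
        self.mlp = nn.Sequential(*layers)

    def forward(self, feat_ids):
        # feat_ids: (batch, num_fields) indices of the active sparse features
        emb = self.embedding(feat_ids)                     # (batch, fields, embed_dim)
        # All pairwise element-wise products: the FM second-order interactions
        i, j = torch.triu_indices(emb.size(1), emb.size(1), offset=1)
        pairs = emb[:, i] * emb[:, j]                      # (batch, num_pairs, embed_dim)
        # Attention assigns larger weights to more informative interactions
        weights = torch.softmax(self.attn(pairs), dim=1)   # (batch, num_pairs, 1)
        pooled = (weights * pairs).sum(dim=1)              # (batch, embed_dim)
        first_order = self.linear(feat_ids).sum(dim=1)     # (batch, 1)
        return self.bias + first_order + self.mlp(pooled)  # (batch, 1) prediction

# Example: score a batch of 8 samples, each with 10 sparse feature indices
model = NAMSketch(num_features=1000)
scores = model(torch.randint(0, 1000, (8, 10)))   # shape: (8, 1)
```

With a standard regression or ranking loss on the output scores, this sketch mirrors the abstract's two claims: the softmax attention differentiates the importance of low-order interactions, and the stacked fully connected layers model higher-order interactions beyond what a plain FM captures.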