Recommendations with residual connections and negative sampling based on knowledge graphs


Full Description

Bibliographic Details
Published in: Knowledge-Based Systems, 2022-12, Vol. 258, p. 110049, Article 110049
Main Authors: Liu, Yuanyuan; Zhong, Zhaoqian; Che, Chao; Zhu, Yongjun
Format: Article
Language: English
Online Access: Full text
Description
Abstract: A knowledge graph (KG) contains a large amount of well-structured external triple information that can effectively address the poor interpretability of collaborative filtering. Recently, recommendation system (RS) models relying on graph neural networks (GNNs) have been widely developed, but increasing the number of GNN layers inevitably leads to over-smoothing. Meanwhile, most current KG-based negative sampling strategies randomly collect negative samples from unobserved data to train RS models. However, these strategies are insufficient to generate negative samples that reflect genuine user demands. To overcome these obstacles, we design a model called Knowledge Graph Residual Negative Sampling Recommendation (KGRNS), which utilizes residual connections and a pooling operation to alleviate the over-smoothing problem and generates high-quality negative samples through negative sampling. Specifically, we devise residual connections on each output layer of the GNN and then apply a sum pooling operation to mitigate the effects of over-smoothing on the model. In addition, to generate high-quality negative samples, we create a gated strategy that mixes the knowledge of positive and negative samples into synthetic negative samples, and then select the virtual negative sample that is closest to the positive one through a theoretically backed hard negative sample selection strategy. We conducted broad experiments on three datasets. The experimental results showed that KGRNS achieved considerable improvements over state-of-the-art methods, and ablation studies validated the effectiveness of each part of KGRNS.
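The residual-plus-pooling idea described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the normalized adjacency `adj`, the initial embeddings `h0`, and the simple layer update `h = adj @ h + h` are illustrative assumptions standing in for the GNN layers in KGRNS; the point is only that each layer keeps a residual copy of its input and the final representation sum-pools over all layer outputs, so less-smoothed low-layer signal survives.

```python
import numpy as np

def propagate_with_residual(adj, h0, num_layers=3):
    """Propagate node embeddings through `num_layers` GNN layers.

    Each layer adds a residual connection (A h + h), and the final
    representation is a sum pooling over the outputs of every layer,
    which mitigates over-smoothing by retaining early-layer signal.
    Hypothetical sketch: `adj` is a normalized adjacency matrix,
    `h0` the initial embedding matrix (nodes x dims).
    """
    outputs = [h0]          # layer-0 output is the input itself
    h = h0
    for _ in range(num_layers):
        h = adj @ h + h     # neighbor aggregation + residual connection
        outputs.append(h)
    return np.sum(outputs, axis=0)  # sum pooling over all layer outputs
```

With one layer, the pooled result is simply `h0 + (adj @ h0 + h0)`, showing how the original embedding is carried through unchanged alongside the smoothed one.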
• Utilizing residual connections and a sum pooling operation to alleviate the over-smoothing problem.
• Designing a gated strategy to mix the knowledge of positive and negative samples.
• Selecting virtual negative samples through a purpose-designed negative sample selection strategy.
• Extensive experiments demonstrate the state-of-the-art performance of KGRNS.
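The gated negative-sampling step can likewise be sketched. Again this is an assumption-laden illustration, not the paper's method: the random sigmoid gate stands in for the learned gate in KGRNS, and plain Euclidean distance stands in for whatever similarity the hard-negative selection actually uses. The sketch shows the two moves the abstract names: mix positive and candidate-negative embeddings through a gate, then keep the synthetic negative closest to the positive.

```python
import numpy as np

def synthesize_hard_negative(pos, neg_candidates, rng=None):
    """Gated mixing of positive/negative embeddings + hard selection.

    `pos` is one positive embedding (dims,); `neg_candidates` is a
    batch of sampled negatives (num_candidates x dims). A per-element
    gate in (0, 1) blends each candidate with the positive, and the
    synthetic negative nearest to the positive is returned as the
    hard negative. The random gate here is a placeholder for a
    learned gating network.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    gate = 1.0 / (1.0 + np.exp(-rng.standard_normal(neg_candidates.shape)))
    synthetic = gate * pos + (1.0 - gate) * neg_candidates  # gated mixture
    dists = np.linalg.norm(synthetic - pos, axis=1)
    return synthetic[np.argmin(dists)]  # hardest = closest to the positive
```

Training the recommender against such synthesized hard negatives, rather than uniformly sampled unobserved items, is what the abstract credits with producing negatives that better reflect genuine user demands.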
ISSN:0950-7051
1872-7409
DOI:10.1016/j.knosys.2022.110049