Edge2vec: Edge-based Social Network Embedding

Bibliographic Details
Published in: ACM Transactions on Knowledge Discovery from Data, 2020-08, Vol. 14 (4), p. 1-24
Main authors: Wang, Changping; Wang, Chaokun; Wang, Zheng; Ye, Xiaojun; Yu, Philip S.
Format: Article
Language: English
Online access: Full text
Description
Abstract: Graph embedding, also known as network embedding or network representation learning, is a useful technique that helps researchers analyze information networks by embedding a network into a low-dimensional space. However, existing graph embedding methods are all node-based: they directly map only the nodes of a network to low-dimensional vectors, while edges can be mapped to vectors only indirectly. One important reason is the computational cost, because the number of edges is usually far greater than the number of nodes. In this article, exploiting an important property of social networks, namely that the network is sparse and hence the average degree of nodes is bounded (so the number of edges grows only linearly with the number of nodes), we propose an edge-based graph embedding (edge2vec) method that maps the edges in social networks directly to low-dimensional vectors. Edge2vec takes both the local and the global structure information of edges into account to preserve the structure information of embedded edges as much as possible. To achieve this goal, edge2vec combines the deep autoencoder and the Skip-gram model through a well-designed deep neural network. Experimental results on different datasets show that edge2vec benefits from this direct mapping in preserving the structure information of edges.
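
The abstract describes edge2vec only at a high level: a deep autoencoder combined with a Skip-gram model, covering local and global edge structure. For intuition, the following is a minimal PyTorch sketch of that kind of objective, a reconstruction term that preserves each edge's own (local) representation plus a skip-gram-style term with negative sampling that draws together edges occurring in the same context. Everything concrete here (the adjacency-row featurization of an edge, layer sizes, the loss weight alpha, and the names EdgeAutoencoder and edge2vec_style_loss) is a hypothetical illustration under stated assumptions, not the authors' actual architecture.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class EdgeAutoencoder(nn.Module):
        """Autoencoder over per-edge feature vectors (illustrative sizes)."""
        def __init__(self, in_dim: int, emb_dim: int):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                         nn.Linear(256, emb_dim))
            self.decoder = nn.Sequential(nn.Linear(emb_dim, 256), nn.ReLU(),
                                         nn.Linear(256, in_dim))

        def forward(self, x):
            z = self.encoder(x)       # low-dimensional edge embedding
            x_hat = self.decoder(z)   # reconstruction of the raw edge features
            return z, x_hat

    def edge2vec_style_loss(model, x, ctx_x, neg_x, alpha=0.5):
        """x: batch of edge features; ctx_x: context edges; neg_x: negative
        samples. Combines an autoencoder (local) term with a skip-gram-style
        negative-sampling (global) term; alpha is an assumed trade-off weight."""
        z, x_hat = model(x)
        z_ctx, _ = model(ctx_x)
        z_neg, _ = model(neg_x)
        recon = F.mse_loss(x_hat, x)                      # local structure
        pos = F.logsigmoid((z * z_ctx).sum(-1)).mean()    # attract context edges
        neg = F.logsigmoid(-(z * z_neg).sum(-1)).mean()   # repel negative edges
        return alpha * recon - (pos + neg)

    # Hypothetical usage: featurize each edge (u, v) by concatenating the
    # adjacency rows of its endpoints (one possible choice, not prescribed
    # by the paper), then take a single optimization step.
    n_nodes, emb_dim = 100, 32
    A = (torch.rand(n_nodes, n_nodes) < 0.05).float()     # toy sparse graph

    def edge_feat(u, v):
        return torch.cat([A[u], A[v]])

    model = EdgeAutoencoder(in_dim=2 * n_nodes, emb_dim=emb_dim)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    x     = torch.stack([edge_feat(0, 1), edge_feat(2, 3)])   # target edges
    ctx_x = torch.stack([edge_feat(1, 4), edge_feat(3, 5)])   # context edges
    neg_x = torch.stack([edge_feat(7, 8), edge_feat(6, 9)])   # negatives

    loss = edge2vec_style_loss(model, x, ctx_x, neg_x)
    loss.backward()
    opt.step()

One consequence of the sparsity argument above: with bounded average degree, the number of edges (and hence of training examples in a sketch like this) stays linear in the number of nodes, which is what makes embedding edges directly affordable.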
ISSN: 1556-4681
EISSN: 1556-472X
DOI: 10.1145/3391298