Large graph layout optimization based on vision and computational efficiency: a survey
Saved in:
Published in: | Visual Intelligence 2023-07, Vol. 1 (1), Article 14
---|---
Main authors: | , ,
Format: | Article
Language: | English
Subjects: |
Online access: | Full text
Summary: | Graph layout can help users explore graph data intuitively. However, when handling large volumes of graph data, the high time complexity of layout algorithms and the overlap of visual elements usually lead to a significant decrease in analysis efficiency and user experience. Increasing computing speed and improving the visual quality of large graph layouts are two key approaches to solving these problems. Previous surveys have mainly been conducted from the perspectives of specific graph types, layout techniques, and layout evaluation, while seldom concentrating on layout optimization. This paper reviews recent work on optimizing the visual and computational efficiency of graphs, and establishes a taxonomy according to the stage at which these methods are applied: pre-layout, in-layout, and post-layout. Pre-layout methods focus on graph data compression techniques, which involve graph filtering and graph aggregation. In-layout approaches optimize the layout process through computing architecture and algorithms, including deep learning techniques. Visual mapping and interactive layout adjustment are post-layout optimization techniques. Our survey reviews current research on large graph layout optimization techniques at different stages of the layout design process, and presents possible research challenges and opportunities for the future.
ISSN: | 2731-9008
DOI: | 10.1007/s44267-023-00007-w