The Robot Crawler Model on Complete k-Partite and Erd\H{o}s-R\'enyi Random Graphs
Format: Article
Language: English
Online access: Order full text
Summary: Web crawlers are used by internet search engines to gather information about the web graph. In this paper we investigate a simple process which models such software by walking around the vertices of a graph. Once initial random vertex weights have been assigned, the robot crawler traverses the graph deterministically, following a greedy algorithm: it always visits the neighbour of least weight and then updates that weight to be the highest overall. We consider the maximum, minimum and average number of steps taken by the crawler to visit every vertex of, firstly, complete k-partite graphs and, secondly, sparse Erd\H{o}s-R\'enyi random graphs. Our work follows on from a paper of Bonato et al., who introduced the model.
DOI: 10.48550/arxiv.1702.08371
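The greedy rule described in the summary lends itself to a short simulation. Below is a minimal Python sketch, under two assumptions not stated in the abstract: the crawl starts at the vertex of least initial weight, and each move counts as one step. The function name robot_crawler and the adjacency-dict graph encoding are illustrative choices, not taken from the paper.

```python
import random

def robot_crawler(adjacency, seed=None):
    """Run the greedy crawler until every vertex has been visited.

    adjacency: dict mapping each vertex to a list of its neighbours.
    Returns the number of moves the crawler makes.
    """
    rng = random.Random(seed)
    vertices = list(adjacency)

    # Initial weights: a uniformly random permutation of 0..n-1.
    order = vertices[:]
    rng.shuffle(order)
    weight = {v: i for i, v in enumerate(order)}

    # Assumption: the crawl starts at the vertex of least initial weight.
    current = min(vertices, key=weight.get)
    visited = {current}
    top = len(vertices)  # strictly larger than every initial weight
    weight[current] = top

    steps = 0
    while len(visited) < len(vertices):
        # Greedy rule: visit the neighbour of least weight, then raise
        # that vertex's weight above all others.
        current = min(adjacency[current], key=weight.get)
        visited.add(current)
        top += 1
        weight[current] = top
        steps += 1
    return steps

if __name__ == "__main__":
    # Complete bipartite graph K_{3,3} (a complete 2-partite graph):
    # each vertex is adjacent to every vertex in the other part.
    part_of = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}
    adjacency = {v: [u for u in part_of if part_of[u] != part_of[v]]
                 for v in part_of}
    runs = [robot_crawler(adjacency, seed=s) for s in range(1000)]
    print(min(runs), sum(runs) / len(runs), max(runs))
```

The demo prints the minimum, average and maximum number of moves over 1,000 random initial weightings, mirroring the three quantities the paper studies; on any connected n-vertex graph the best possible run is n - 1 moves, achieved when the crawler never revisits a vertex.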