Self-Stabilized Distributed Network Distance Prediction

Detailed Description

Bibliographic Details
Published in: IEEE/ACM Transactions on Networking, 2017-02, Vol. 25 (1), pp. 451-464
Authors: Fu, Yongquan; Xiaoping, Xu
Format: Article
Language: English
Description
Abstract: The network distance service obtains the network latency among large-scale nodes. As the number of participating nodes grows, the service must balance accuracy against scalability. Network-coordinate methods scale well by embedding the pairwise latencies into a low-dimensional coordinate system; the prediction errors are iteratively reduced by adjusting each node's coordinates with respect to its neighbors. Unfortunately, this optimization process is vulnerable to inaccurate coordinates, leading to destabilized positions. In this paper, we propose RMF, a relative-coordinate-based distributed sparsity-preserving matrix-factorization method that provides guaranteed stability for the coordinate system. In RMF, each node maintains a low-rank square matrix that is incrementally adjusted with respect to its neighbors' relative coordinates. The optimization is self-stabilizing: it is guaranteed to converge and is not disturbed by inaccurate coordinates, since the relative coordinates carry no computational errors. By exploiting the sparse structure of the square matrix, the optimization enforces an L1-norm regularization that preserves the sparseness of the square matrix. Simulation results and a PlanetLab-based experiment confirm that RMF converges to stable positions within 10 to 15 rounds and decreases the prediction errors by 10% to 20%.
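To make the core idea concrete, here is a minimal sketch of predicting pairwise latencies by a low-rank matrix factorization with L1-norm (sparseness-preserving) regularization. This is a generic centralized illustration, not the paper's distributed RMF algorithm; the function names, the proximal-gradient update, and all parameter values are assumptions.

```python
import numpy as np

def soft_threshold(m, t):
    # Proximal operator of the L1 norm: shrink each entry toward zero by t.
    # This is what keeps the learned factors sparse.
    return np.sign(m) * np.maximum(np.abs(m) - t, 0.0)

def factorize_latency(d, rank=3, lam=0.01, lr=0.05, rounds=500, seed=0):
    """Approximate a latency matrix d ~= x @ y.T with sparse low-rank factors.

    Each gradient step on the squared prediction error is followed by
    soft-thresholding, i.e. proximal gradient descent on
        0.5 * ||x @ y.T - d||_F^2 + lam * (||x||_1 + ||y||_1).
    """
    rng = np.random.default_rng(seed)
    n = d.shape[0]
    x = rng.random((n, rank))  # "outgoing" coordinate of each node
    y = rng.random((n, rank))  # "incoming" coordinate of each node
    for _ in range(rounds):
        err = x @ y.T - d          # current prediction error
        gx = err @ y               # gradient of the squared error w.r.t. x
        gy = err.T @ x             # gradient of the squared error w.r.t. y
        x = soft_threshold(x - lr * gx, lr * lam)
        y = soft_threshold(y - lr * gy, lr * lam)
    return x, y
```

The predicted latency between nodes `i` and `j` is then `x[i] @ y[j]`; using two factors rather than a symmetric embedding lets the model represent asymmetric round-trip times, which Euclidean coordinate systems cannot.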
ISSN: 1063-6692, 1558-2566
DOI: 10.1109/TNET.2016.2581592