Managing Redundant Content in Bandwidth Constrained Wireless Networks
Published in: IEEE/ACM Transactions on Networking, 2017-04, Vol. 25, No. 2, pp. 988-1003
Main Authors:
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: Images and videos are often uploaded in bulk during situations like disasters. This can tax the network with increased load and thus higher upload latency, which can be critical for response activities. In such scenarios, prior work has shown that there is significant redundancy in the transferred content (e.g., similar photos taken by different users). By intelligently suppressing or deferring transfers of redundant content, the load can be significantly reduced, thereby facilitating the timely delivery of unique, possibly critical information. A key challenge here, however, is detecting "what content is similar," given that the content is generated by uncoordinated user devices. Toward addressing this challenge, we propose a framework wherein the service to which content is to be uploaded first solicits metadata (e.g., image features) from any device uploading content. By intelligently comparing this metadata with that associated with previously uploaded content, the service effectively identifies (and thus enables the suppression of) redundant content. Our evaluations on a testbed of 20 Android smartphones and via ns-3 simulations show that we can identify similar content with a 70% true positive rate and a 1% false positive rate. The resulting reduction in redundant content transfers translates to a 44% latency reduction for unique content.
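The framework described in the abstract can be illustrated with a minimal sketch: a server-side check that compares a compact feature vector, solicited from an uploading device, against the features of previously uploaded content, and suppresses or defers the transfer when similarity exceeds a threshold. The feature representation, the cosine-similarity metric, the threshold value, and all names below are illustrative assumptions; the paper's actual feature extraction and matching scheme are not detailed in this record.

```python
import numpy as np

# Hypothetical tuning knob; the paper's actual features and
# thresholds are not given in this record.
SIMILARITY_THRESHOLD = 0.9

# Metadata store: feature vectors of previously uploaded content.
stored_features = []


def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def should_suppress(candidate):
    """Decide, from metadata alone, whether an upload looks redundant.

    Returns True if the candidate's features are close enough to some
    previously uploaded item that the full transfer can be suppressed
    or deferred; otherwise records the features and allows the upload.
    """
    for seen in stored_features:
        if cosine_similarity(candidate, seen) >= SIMILARITY_THRESHOLD:
            return True  # likely redundant content
    stored_features.append(candidate)  # unique so far
    return False


# Usage: the service receives compact feature vectors (e.g., image
# descriptors) from uncoordinated devices and gates the heavy upload.
if __name__ == "__main__":
    photo_a = np.array([0.9, 0.1, 0.4])
    photo_a_similar = np.array([0.88, 0.12, 0.41])
    photo_b = np.array([0.1, 0.9, 0.2])

    print(should_suppress(photo_a))          # False: first unique item
    print(should_suppress(photo_a_similar))  # True: near-duplicate suppressed
    print(should_suppress(photo_b))          # False: new unique item
```

In this sketch, only the lightweight metadata crosses the network before the decision is made; the bandwidth-heavy image or video transfer happens only for content judged unique, which is what produces the latency savings the abstract reports.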
ISSN: 1063-6692 (print), 1558-2566 (electronic)
DOI: 10.1109/TNET.2016.2616901