Traffic-Based Side-Channel Attack in Video Streaming


Bibliographic Details
Published in: IEEE/ACM Transactions on Networking 2019-06, Vol. 27 (3), p. 972-985
Main authors: Gu, Jiaxi, Wang, Jiliang, Yu, Zhiwen, Shen, Kele
Format: Article
Language: English
Description
Summary: Video streaming accounts for an increasing proportion of network traffic. Dynamic adaptive streaming over HTTP (DASH) has become the de facto standard for video streaming and is adopted by YouTube, Netflix, and others. Despite this popularity, network traffic during video streaming exhibits an identifiable pattern that poses a threat to user privacy. In this paper, we propose a video identification method that uses the network traffic observed while streaming. Although DASH adapts the bitrate during streaming, we observe that the video bitrate trend remains relatively stable because of the widely used variable-bitrate (VBR) encoding. Accordingly, we design a robust method to extract video features from eavesdropped streaming traffic. In parallel, we design a VBR-based video fingerprinting method for a candidate video set, which can be built from downloaded video files. Finally, we propose an efficient partial matching method that computes similarities between video fingerprints and streaming traces to derive video identities. We evaluate our attack in different scenarios for various video content, segment lengths, and quality levels. The experimental results show that identification accuracy can reach up to 90% using only three minutes of continuous network traffic eavesdropping.
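The abstract's pipeline (VBR fingerprints of candidate videos, plus partial matching of a short eavesdropped trace against them) can be illustrated with a minimal sketch. All names and the normalized-correlation similarity below are illustrative assumptions, not the authors' exact algorithm; real traffic would also need segment boundaries recovered from encrypted packet sizes.

```python
import math

def normalize(seq):
    # Scale a segment-size sequence to zero mean and unit variance, so
    # matching depends on the bitrate *trend*, not the absolute quality level.
    mean = sum(seq) / len(seq)
    var = sum((x - mean) ** 2 for x in seq) / len(seq)
    std = math.sqrt(var) or 1.0  # guard against constant sequences
    return [(x - mean) / std for x in seq]

def partial_match_score(fingerprint, trace):
    # The eavesdropped trace covers only part of the video, so slide it
    # over the full fingerprint and keep the best window correlation.
    n = len(trace)
    t = normalize(trace)
    best = -1.0
    for start in range(len(fingerprint) - n + 1):
        w = normalize(fingerprint[start:start + n])
        corr = sum(a * b for a, b in zip(w, t)) / n
        best = max(best, corr)
    return best

def identify(candidates, trace):
    # Return the candidate video whose VBR fingerprint best matches the trace.
    return max(candidates, key=lambda name: partial_match_score(candidates[name], trace))

# Hypothetical example: per-segment sizes (arbitrary units) for two candidates,
# and a noisy observation of three consecutive segments of video_a.
fingerprints = {
    "video_a": [10, 50, 20, 80, 30, 60, 40],  # bursty VBR profile
    "video_b": [30, 30, 30, 30, 30, 30, 30],  # nearly constant bitrate
}
trace = [52, 21, 79]
print(identify(fingerprints, trace))  # matches the VBR profile of video_a
```

Normalizing each window before correlating is what makes the match robust to DASH quality switches: a quality change rescales segment sizes roughly uniformly, leaving the VBR trend intact.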
ISSN: 1063-6692, 1558-2566
DOI: 10.1109/TNET.2019.2906568