Addressing niche demand based on joint mobility prediction and content popularity caching

Bibliographic Details
Published in: Computer Networks (Amsterdam, Netherlands : 1999), 2016-12, Vol. 110, p. 306-323
Main Authors: Vasilakos, Xenofon, Siris, Vasilios A., Polyzos, George C.
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: We present an efficient mobility-based proactive caching model for addressing niche mobile demand, along with popularity-based and legacy caching model extensions. Unlike other proactive solutions, which focus on popular content, we propose a distributed solution that targets less popular, personalised or dynamic content requests by prefetching data in small cells based on aggregated user mobility prediction information. According to notable studies, niche demand, particularly for video content, represents a significant 20–40% of Internet demand and follows a growing trend. Due to its novel design, our model can directly address such demand, while also making joint use of content popularity information, with the novelty of dynamically tuning the contribution of mobility prediction and content popularity to local cache actions. Based on thorough performance evaluation simulations exploring different demand levels, video catalogues and mobility scenarios, including human walking and automobile mobility, we show that gains from mobility prediction can be high and adapt well to temporal locality due to the short timescale of measurements, exceeding cache gains from popularity-only caching by up to 41% in low caching demand scenarios. Our model's performance can be further improved, at the cost of added computational overhead, by adapting cache replacements, e.g. by 41% in the aforementioned scenarios. Also, we find that it is easier to benefit from request popularity with low mobile caching demand, and that mobility-based gains grow with popularity skewness, approaching the high and robust gains yielded by the model extensions.
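
The summary describes local cache actions driven by a tunable mix of aggregated mobility-prediction information and content popularity. The sketch below is only an illustration of that idea under our own assumptions: the `SmallCellCache` class, the fixed weight `alpha`, the additive score and the prefetch step are hypothetical and do not reproduce the paper's actual model, measurement timescales or replacement policy.

```python
# Hypothetical sketch of joint mobility/popularity caching at a small cell.
# All names and the weighting scheme are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class SmallCellCache:
    """Toy small-cell cache ranking items by a tunable mix of
    mobility-predicted demand and observed request popularity."""
    capacity: int
    alpha: float = 0.5  # weight of mobility prediction vs. popularity
    popularity: dict = field(default_factory=dict)      # item -> request count at this cell
    mobility_demand: dict = field(default_factory=dict)  # item -> aggregated handover probability
    store: set = field(default_factory=set)

    def record_request(self, item: str) -> None:
        # Local popularity statistics from requests served by this cell.
        self.popularity[item] = self.popularity.get(item, 0) + 1

    def record_prediction(self, item: str, handover_prob: float) -> None:
        # Aggregate mobility predictions: probability that a user expected to
        # hand over to this cell will request `item` shortly after arriving.
        self.mobility_demand[item] = self.mobility_demand.get(item, 0.0) + handover_prob

    def score(self, item: str) -> float:
        total_requests = sum(self.popularity.values()) or 1
        pop = self.popularity.get(item, 0) / total_requests  # normalised popularity
        mob = self.mobility_demand.get(item, 0.0)             # expected near-term demand
        return self.alpha * mob + (1.0 - self.alpha) * pop

    def refresh(self) -> None:
        # Proactively prefetch the top-scoring items into the limited cache.
        candidates = set(self.popularity) | set(self.mobility_demand)
        ranked = sorted(candidates, key=self.score, reverse=True)
        self.store = set(ranked[: self.capacity])


cell = SmallCellCache(capacity=1, alpha=0.7)
cell.record_request("popular_show")                               # locally popular item
cell.record_prediction("personal_video", handover_prob=0.9)       # niche item a predicted incoming user wants
cell.record_prediction("personal_video", handover_prob=0.6)
cell.refresh()
print(cell.store)  # {'personal_video'}: the niche item outranks the popular one
```

With `alpha` biased toward mobility prediction, a niche item that a predicted incoming user is expected to request can outrank globally popular content, which reflects the intuition of addressing niche demand through prefetching in small cells.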
ISSN: 1389-1286, 1872-7069
DOI: 10.1016/j.comnet.2016.10.001