Stochastic Coded Caching with Optimized Shared-Cache Sizes and Reduced Subpacketization

Bibliographic Details
Main Authors: Malik, Adeel, Serbetci, Berksan, Elia, Petros
Format: Article
Language: English
Description
Abstract: This work studies the $K$-user broadcast channel with $\Lambda$ caches, when the association between users and caches is random, i.e., the scenario where each user appears within the coverage area of, and is subsequently assisted by, a specific cache according to a given probability distribution. The caches are subject to a cumulative memory constraint equal to $t$ times the size of the library. We provide a scheme consisting of three phases: a storage allocation phase, a content placement phase, and a delivery phase. We show that an optimized storage allocation across the caches, together with a modified uncoded cache placement and delivery strategy, alleviates the adverse effect of cache-load imbalance by significantly reducing the multiplicative performance deterioration due to randomness. In short, our scheme substantially mitigates the impact of cache-load imbalance in stochastic networks and, compared to the best-known state of the art, also eases the well-known subpacketization bottleneck: when applied in deterministic settings, it achieves the same delivery time, which was proven to be close to optimal for bounded values of $t$, with an exponential reduction in the subpacketization.
DOI: 10.48550/arxiv.2112.14114
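
As a rough illustration of the stochastic user-to-cache association described in the abstract (a minimal sketch of the setting only, not of the paper's scheme), the following Python snippet draws a random assignment of $K$ users to $\Lambda$ caches from an assumed association distribution and reports the resulting cache loads. The values of K, Lambda, and the distribution p below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical illustration of cache-load imbalance: K users each attach to
# one of Lambda caches according to a probability distribution p, so the
# vector of cache loads is multinomial(K, p). Uneven p (or plain randomness)
# makes some caches far more loaded than the average.

rng = np.random.default_rng(0)

K = 100                              # number of users (illustrative value)
Lambda = 10                          # number of caches (illustrative value)
p = rng.dirichlet(np.ones(Lambda))   # assumed association probabilities

loads = rng.multinomial(K, p)        # users served by each cache

print("association probabilities:", np.round(p, 3))
print("cache loads:              ", loads)
print("max load / mean load:     ", loads.max() / loads.mean())
```

The ratio of the maximum load to the average load printed at the end is one simple way to visualize the imbalance that the abstract refers to; how such imbalance translates into delivery time, and how optimized storage allocation counteracts it, is the subject of the paper itself.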