Hierarchical coded caching with heterogeneous cache sizes
Published in: Wireless Networks, 2024-05, Vol. 30 (4), p. 2001-2016
Format: Article
Language: English
Online access: Full text
Abstract: In this paper, we investigate a network architecture enabled by two layers of caches, where users receive their requested content via intermediate helpers connected to a central server. While coded caching in a two-layer hierarchical model has previously demonstrated its potential to enhance data rates when cache capacities are uniform and no coordination exists between users, our work goes a step further. We introduce heterogeneous cache sizes among both the helpers and the users, addressing scenarios where the number of popular files can be either less than or greater than the number of users in the network. Leveraging a recently proposed modified coded caching scheme combined with a zero-padding technique, we present novel results on data rates, supported by an illustrative example. Our contribution extends to the formulation of two distinct coded schemes within the hierarchical scenario. Furthermore, we optimize the proportions of files and memories allocated to each scheme, improving data transfer efficiency, and then derive a lower bound on the total rate. Moreover, we demonstrate that the total rate achieved by the proposed heterogeneous approach is lower than that of a homogeneous network whose caches equal the minimum cache size present in the heterogeneous network, but higher than that of a homogeneous network with the same average cache size. In addition, we show that, by proper selection of the proportions of files and memories allocated to each scheme, we can reduce the performance degradation caused by the heterogeneity of the network.
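The abstract does not state the rate expressions or the paper's actual scheme, so the following is only a minimal sketch of the file/memory-splitting idea it describes: it assumes the classical uncoordinated coded caching rate R(M) = K(1 - M/N)/(1 + KM/N) for K users, N files, and per-user cache size M, and grid-searches the proportions (alpha, beta) of files and memory assigned to each of two sub-schemes. All function names, parameters, and toy numbers are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def mn_rate(K, N, M):
    """Classical coded caching delivery rate for K users, N files,
    per-user cache size M (in units of files), without user coordination."""
    if N <= 0 or M >= N:
        return 0.0
    return K * (1.0 - M / N) / (1.0 + K * M / N)

def split_rate(K, N, M, alpha, beta):
    """Total rate when a fraction alpha of the files and beta of the memory
    are handled by one sub-scheme and the remainder by a second one."""
    return (mn_rate(K, alpha * N, beta * M)
            + mn_rate(K, (1.0 - alpha) * N, (1.0 - beta) * M))

def best_split(K, N, M, grid=101):
    """Grid-search the (alpha, beta) proportions that minimize the total rate."""
    best = (np.inf, None, None)
    for alpha in np.linspace(0.0, 1.0, grid):
        for beta in np.linspace(0.0, 1.0, grid):
            r = split_rate(K, N, M, alpha, beta)
            if r < best[0]:
                best = (r, alpha, beta)
    return best

if __name__ == "__main__":
    # Toy parameters, not taken from the paper.
    print(best_split(K=10, N=20, M=4))
```

The same splitting logic is what the optimization over file and memory proportions in the abstract refers to; in the paper's hierarchical setting the split additionally spans the helper and user cache layers.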
ISSN: 1022-0038, 1572-8196
DOI: 10.1007/s11276-023-03620-1