Enhancing Information Freshness and Energy Efficiency in D2D Networks Through DRL-Based Scheduling and Resource Management


Bibliographic Details
Published in: IEEE Open Journal of Vehicular Technology, 2025, Vol. 6, pp. 52-67
Authors: Parhizgar, Parisa; Mahdavi, Mehdi; Ahmadzadeh, Mohammad Reza; Erol-Kantarci, Melike
Format: Article
Language: English
Online access: Full text
Description
Abstract: This paper investigates resource management in device-to-device (D2D) networks coexisting with cellular user equipment (CUEs). We introduce a novel model for joint scheduling and resource management in D2D networks that takes environmental constraints into account. To preserve information freshness, quantified by the average age of information (AoI), and to effectively utilize energy harvesting (EH) technology to satisfy the network's energy needs, we formulate an online optimization problem that minimizes the average AoI. The formulation accounts for the quality of service (QoS) of both CUEs and D2Ds, the available power, information freshness, and environmental sensing requirements. Because the problem is a mixed-integer nonlinear program that must be solved online, we propose a deep reinforcement learning (DRL) approach. Numerical results show that the proposed joint scheduling and resource management strategy, built on the soft actor-critic (SAC) algorithm, reduces the average AoI by 20% compared with baseline methods.
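
For context, the age of information referenced in the abstract is conventionally defined as the time elapsed since the generation of the most recently received status update, and the average AoI is its long-run time average. The paper's own formulation (for example, a discrete-time average taken per D2D link) may differ; the following is only the standard textbook definition, stated here to fix notation:

\Delta(t) = t - u(t), \qquad \bar{\Delta} = \lim_{T \to \infty} \frac{1}{T} \int_{0}^{T} \Delta(t)\, \mathrm{d}t,

where u(t) denotes the generation time of the newest update received by time t. Minimizing \bar{\Delta} therefore favors scheduling decisions that deliver fresh updates frequently rather than merely maximizing throughput.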
ISSN: 2644-1330
DOI: 10.1109/OJVT.2024.3502803