Energy Efficiency Strategy for Big Data in Cloud Environment Using Deep Reinforcement Learning

Bibliographic Details
Published in: Mobile Information Systems, 2022-08, Vol. 2022, p. 1-11
Main Authors: Pandey, Neeraj Kumar; Diwakar, Manoj; Shankar, Achyut; Singh, Prabhishek; Khosravi, Mohammad R.; Kumar, Vivek
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Big data requires massive cloud resources for data processing and analysis, which consume considerable energy. Resources and tasks for big data processing are growing exponentially in the cloud environment, which increases the power consumption of cloud data centers. There is therefore always scope for optimizing energy utilization in cloud data centers. This paper presents a visionary architecture in a cloud environment for big data with a proposed energy-efficient strategy based on LSTM-DQN (long short-term memory deep Q-network) using reinforcement learning (RL). Traditional techniques are not efficient when tasks are allocated dynamically, and generic RL strategies cannot retain the data iterated in earlier processing cycles, so LSTM is adopted for this purpose. In the proposed model, an integration of DPSO and DQN is used for better estimation and to mitigate the curse of dimensionality. The proposed strategy is compared with different variants of PSO (particle swarm optimization) such as DPSO and QoS-PSO. The improvement achieved by the proposed model is recorded over algorithms such as load-aware (8.01%), DQN (13.36%), EA-DQN (34.16%), L-No-DEAF (15.62%), DPSO (62.68%), QoS-PSO (72.69%), FFO-EVSM (75.42%), and MIMT (76.39%) on the parameters of energy efficiency, task completion time, and energy consumption over the timeline. The proposed model is therefore encouraging for an energy-efficient cloud environment for big data, given the challenges the technological world is facing and the emergence of deep learning as a promising field.
ISSN: 1574-017X, 1875-905X
DOI: 10.1155/2022/8716132
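
As context for the approach summarized in the abstract, the sketch below shows one way an LSTM-based deep Q-network for task placement could be structured in PyTorch. It is a minimal illustration only: the state features, sequence window, action space, and names such as LSTMDQN and select_action are assumptions for demonstration and do not reproduce the authors' implementation or the DPSO integration.

# Minimal sketch (assumption, not the paper's code): an LSTM-backed Q-network
# whose recurrent layer retains information from earlier scheduling cycles,
# as motivated in the abstract. Requires PyTorch.
import random
import torch
import torch.nn as nn


class LSTMDQN(nn.Module):
    """Q-network: LSTM over a window of past resource states, linear Q-value head."""

    def __init__(self, state_dim: int, hidden_dim: int, n_actions: int):
        super().__init__()
        self.lstm = nn.LSTM(state_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, n_actions)

    def forward(self, state_seq: torch.Tensor) -> torch.Tensor:
        # state_seq: (batch, window, state_dim) of utilization features per cycle
        out, _ = self.lstm(state_seq)
        return self.head(out[:, -1, :])  # Q-values from the most recent cycle


def select_action(net: LSTMDQN, state_seq: torch.Tensor, epsilon: float, n_actions: int) -> int:
    """Epsilon-greedy choice over candidate task placements (hosts/VMs)."""
    if random.random() < epsilon:
        return random.randrange(n_actions)
    with torch.no_grad():
        return int(net(state_seq).argmax(dim=1).item())


# Hypothetical usage: 8 utilization features per cycle, a window of 10 past
# cycles, and 16 candidate placements for an incoming task.
net = LSTMDQN(state_dim=8, hidden_dim=64, n_actions=16)
state_window = torch.zeros(1, 10, 8)  # placeholder observation history
action = select_action(net, state_window, epsilon=0.1, n_actions=16)

In a complete pipeline, a replay buffer and a target network would be used to train such a network against an energy-aware reward, following standard DQN practice; those training details are omitted here.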