Research on overall energy consumption optimization method for data center based on deep reinforcement learning
Saved in:
Published in: Journal of intelligent & fuzzy systems 2023-05, Vol.44 (5), p.7333-7349
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: With the rapid development of cloud computing, large-scale data centers are becoming increasingly common, which makes their energy management more complex. To achieve better energy savings, the problems of concurrent management and the interdependence of IT, cooling, storage, and network equipment must be addressed. Reinforcement learning learns by interacting with the environment, which makes it well suited to autonomous management of the data center. In this paper, an overall energy consumption optimization method for data centers based on deep reinforcement learning is proposed to achieve collaborative energy saving across data center task scheduling and cooling equipment. A new multi-agent architecture separates the training process from the execution process, simplifying interaction during system operation and improving operating performance. In the deep learning stage, a hybrid deep Q-network algorithm is proposed to optimize the joint action-value function of the data center and obtain the optimal policy. Experiments show that, compared with other reinforcement learning methods, our method not only reduces the energy consumption of the data center but also reduces the frequency of hot spots.
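For illustration only, the sketch below shows the general idea of a deep Q-network scoring joint task-placement and cooling-setpoint actions, as in standard DQN training. The state/action dimensions, network sizes, reward convention, and hyperparameters are assumptions made for this example; it is not the paper's hybrid deep Q-network or multi-agent architecture.

```python
# Minimal DQN sketch for a joint (task placement, cooling setpoint) action space.
# All dimensions and constants below are illustrative assumptions.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

STATE_DIM = 8                       # e.g. rack utilizations + inlet temperatures (assumed)
N_PLACEMENTS = 4                    # candidate servers for the next task (assumed)
N_SETPOINTS = 3                     # discrete cooling setpoints (assumed)
N_ACTIONS = N_PLACEMENTS * N_SETPOINTS  # joint action space

class QNet(nn.Module):
    """Maps a data-center state vector to Q-values over joint actions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, x):
        return self.net(x)

policy_net, target_net = QNet(), QNet()
target_net.load_state_dict(policy_net.state_dict())
optimizer = optim.Adam(policy_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)       # stores (state, action_idx, reward, next_state)
GAMMA, EPSILON = 0.99, 0.1

def select_action(state):
    # Epsilon-greedy over the joint action index; decode with
    # (placement, setpoint) = divmod(index, N_SETPOINTS).
    if random.random() < EPSILON:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return int(policy_net(state).argmax())

def train_step(batch_size=32):
    # One temporal-difference update toward r + gamma * max_a' Q_target(s', a').
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    states = torch.stack([b[0] for b in batch])
    actions = torch.tensor([b[1] for b in batch])
    rewards = torch.tensor([b[2] for b in batch], dtype=torch.float32)
    next_states = torch.stack([b[3] for b in batch])
    q = policy_net(states).gather(1, actions.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        q_next = target_net(next_states).max(1).values
    loss = nn.functional.smooth_l1_loss(q, rewards + GAMMA * q_next)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In this kind of setup, a reward combining negative total power draw with a penalty for hot-spot temperature violations would drive the agent toward the collaborative energy-saving behavior the abstract describes; the exact reward design used in the paper is not specified here.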
ISSN: 1064-1246, 1875-8967
DOI: 10.3233/JIFS-223769