An Energy-Saving Task Scheduling Model via Greedy Strategy under Cloud Environment

Bibliographic Details
Published in: Wireless Communications and Mobile Computing, 2022-04, Vol. 2022, p. 1-13
Authors: Liu, Shuaishuai; Ma, Xinyu; Jia, Yuanfei; Liu, Yue
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Cloud computing, an emerging computing paradigm, has attracted wide attention due to its high scalability and availability. An essential stage of cloud computing is cloud resource management. Existing research on cloud computing technology has two prevalent disadvantages: high energy consumption and low resource utilization. Since greedy scheduling is an effective strategy for cloud resource management, particularly for improving resource utilization and reducing energy consumption, we take the heterogeneous characteristics of resources into account to reduce the energy consumption of the datacenter, treating tasks as the fundamental element of the cloud datacenter. Meanwhile, granular computing is a strategy for solving complex problems through granulation. We therefore introduce granular computing theory into cloud task scheduling and propose a greedy scheduling strategy based on different information granules, dividing tasks into three types (i.e., CPU, memory, and hybrid). Finally, we assign different scheduling strategies to cloud tasks with different characteristics. Numerical experiments on the CloudSim platform show that our method significantly reduces energy consumption and is a practical task scheduling algorithm.
ISSN: 1530-8669, 1530-8677
DOI: 10.1155/2022/8769674
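
To make the abstract's idea concrete, the following is a minimal illustrative Java sketch of a type-aware greedy assignment: each task is classified as CPU-, memory-, or hybrid-intensive and then placed on the host whose estimated energy increase is smallest. The classification thresholds, the linear power model, and all class and field names are assumptions for illustration only; they are not taken from the paper or from the CloudSim API.

import java.util.*;

// Hypothetical sketch of a type-aware greedy scheduler. Thresholds and the
// linear power model below are illustrative assumptions, not the paper's.
public class GreedyTypeAwareScheduler {

    enum TaskType { CPU, MEMORY, HYBRID }

    record Task(double cpuDemand, double memDemand) {
        TaskType type() {
            double ratio = cpuDemand / (cpuDemand + memDemand);
            if (ratio > 0.7) return TaskType.CPU;     // assumed threshold
            if (ratio < 0.3) return TaskType.MEMORY;  // assumed threshold
            return TaskType.HYBRID;
        }
    }

    static class Host {
        double cpuCapacity, memCapacity, cpuUsed, memUsed;
        double idlePower, maxPower;                   // simple linear power model

        Host(double cpu, double mem, double idle, double max) {
            cpuCapacity = cpu; memCapacity = mem; idlePower = idle; maxPower = max;
        }
        boolean fits(Task t) {
            return cpuUsed + t.cpuDemand() <= cpuCapacity
                && memUsed + t.memDemand() <= memCapacity;
        }
        double power(double cpuLoad) {
            return idlePower + (maxPower - idlePower) * (cpuLoad / cpuCapacity);
        }
        // Energy cost of adding the task, approximated by the rise in power draw.
        double energyDelta(Task t) {
            return power(cpuUsed + t.cpuDemand()) - power(cpuUsed);
        }
        void place(Task t) { cpuUsed += t.cpuDemand(); memUsed += t.memDemand(); }
    }

    // Greedy step: tasks are grouped by type, then each task goes to the
    // feasible host with the smallest estimated energy increase.
    static Map<Task, Host> schedule(List<Task> tasks, List<Host> hosts) {
        Map<Task, Host> plan = new LinkedHashMap<>();
        tasks.sort(Comparator.comparing(Task::type));
        for (Task t : tasks) {
            Host best = null;
            for (Host h : hosts) {
                if (h.fits(t) && (best == null || h.energyDelta(t) < best.energyDelta(t))) {
                    best = h;
                }
            }
            if (best != null) { best.place(t); plan.put(t, best); }
        }
        return plan;
    }

    public static void main(String[] args) {
        List<Host> hosts = new ArrayList<>(List.of(
            new Host(100, 64, 90, 250), new Host(200, 32, 120, 300)));
        List<Task> tasks = new ArrayList<>(List.of(
            new Task(40, 4), new Task(5, 20), new Task(30, 16)));
        schedule(tasks, hosts).forEach((t, h) ->
            System.out.println(t + " -> host with cpuUsed=" + h.cpuUsed));
    }
}

The greedy choice here is deliberately simple: minimizing the marginal power increase per placement is one plausible reading of an energy-saving greedy strategy, while the paper itself differentiates the scheduling policy per task type on the CloudSim platform.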