A novel approach for CPU load prediction of cloud server combining denoising and error correction


Detailed Description

Bibliographic Details
Published in: Computing 2023-03, Vol. 105 (3), p. 577-594
Main authors: You, Deguang, Lin, Weiwei, Shi, Fang, Li, Jianzhuo, Qi, Deyu, Fong, Simon
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: Computer servers in cloud data centers consume a huge amount of energy in their operations. Load balancing has been used for energy saving, but it is effective only when CPU loads are predicted accurately. Noise in the energy consumption data is often a detrimental factor responsible for CPU load prediction error, yet in prior work denoising has not been considered as a way to reduce that error. Therefore, a novel prediction approach called CEEMDAN-RIDGE, centered on denoising, is proposed and reported in this paper. First, CEEMDAN is applied to decompose the CPU consumption data, which takes the form of a time series. The curvature similarity between the original series and each of its decomposed series is measured, and by reference to this similarity measure an effective series is obtained by filtering out the noise series. The effective series remaining after this filtration is reconstructed into a new fitting curve for CPU load prediction. Prediction accuracy is further enhanced by an error-correction step called RIDGE, which is made possible by predicting the error a priori from the historical error data of previous predictions. To validate CEEMDAN-RIDGE, a series of experiments is conducted with Google trace data. The experimental results show that the LSTM model using the proposed CPU load prediction approach significantly outperforms other models on three performance metrics: RMSE, MAE, and MAPE.
ISSN:0010-485X
1436-5057
DOI:10.1007/s00607-020-00865-y
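The error-correction idea summarized in the abstract, predicting the next prediction error from historical errors and subtracting it from the raw forecast, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes RIDGE denotes ridge regression, uses the closed-form ridge solution on lagged error windows, and the function names (`ridge_fit`, `correct_prediction`) and the lag/regularization parameters are hypothetical choices for the sketch.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    # Closed-form ridge regression: w = (X^T X + lam*I)^{-1} X^T y
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

def correct_prediction(raw_pred, error_history, lag=3, lam=1.0):
    # Build a lagged design matrix from historical prediction errors,
    # fit ridge weights, predict the next error, and subtract it
    # from the raw CPU-load forecast.
    e = np.asarray(error_history, dtype=float)
    X = np.array([e[i:i + lag] for i in range(len(e) - lag)])
    y = e[lag:]
    w = ridge_fit(X, y, lam)
    next_error = e[-lag:] @ w
    return raw_pred - next_error
```

In this sketch the regularization term `lam` trades bias for variance in the error model; with strongly autocorrelated errors (as residuals of a load predictor often are), even a small lag window can capture much of the correctable structure.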