A Hyperparameter Optimization Algorithm for the LSTM Temperature Prediction Model in Data Center

Published in: Scientific Programming, 2022-12, Vol. 2022, p. 1-13
Authors: Wang, Simin; Ma, Chunmiao; Xu, Yixuan; Wang, Jinyu; Wu, Weiguo
Format: Article
Language: English
Online access: Full text

Abstract: As the main tool for data mining and efficient knowledge acquisition in the era of big data, machine learning is widely used in data center energy-saving research. A machine-learning temperature prediction model forecasts the state of the data center from the upcoming tasks: cooling equipment can be adjusted in advance to avoid temperature-regulation lag, and the air-conditioning setpoint can be matched to actual demand to avoid excessive cooling. Task scheduling and migration algorithms based on temperature prediction can also effectively avoid hot spots. However, the choice of hyperparameters has a great impact on the performance of a machine learning model. In this study, a hyperparameter optimization algorithm based on a multilayer perceptron (MLP) is proposed. After a sample of hyperparameter configurations has been evaluated, the MLP model predicts the objective value over the entire hyperparameter space, and a certain number of high-quality configurations are then selected to train the model again. In each iteration the amount of training data decreases gradually while the accuracy of the model improves rapidly, until appropriate hyperparameters are obtained. The idea of mutation from genetic algorithms is used to raise the probability of finding high-quality solutions, and a loss-function weighting method is used to select the most stable solution. Experiments are carried out on two representative machine learning models, LSTM and Random Forest, and compared with standard Gaussian Bayesian optimization and Random Search. The results show that the proposed method obtains high-precision, high-stability hyperparameters in a single run and greatly improves operating efficiency. The algorithm is effective not only for LSTM but also for other machine learning models.
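
The abstract describes the search loop only at a high level. The following Python sketch (not the authors' implementation) illustrates how such an MLP-surrogate search could be structured: evaluate a sampled batch of configurations, fit an MLP to predict the loss over the whole grid, keep the predicted-best candidates plus a few genetic-style mutants, and repeat with a shrinking budget. The hyperparameter grid, the budget schedule, and the toy `evaluate` objective are illustrative assumptions, and the paper's loss-function weighting step for stability is omitted; in practice `evaluate` would train the LSTM temperature model and return its validation loss.

```python
# Minimal sketch (not the authors' code) of an MLP-surrogate hyperparameter search:
# evaluate a few sampled configurations, train an MLP to predict the loss over the
# whole grid, keep the predicted-best candidates plus a few mutants, and repeat
# with a shrinking evaluation budget.
import itertools
import random

import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical hyperparameter grid (names and ranges are illustrative only).
GRID = {
    "hidden_units": [16, 32, 64, 128, 256],
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3, 1e-2],
    "window_size": [5, 10, 20, 40],
}
SPACE = [dict(zip(GRID, vals)) for vals in itertools.product(*GRID.values())]


def evaluate(cfg):
    """Stand-in for training the real model and returning its validation loss."""
    return ((np.log10(cfg["learning_rate"]) + 3) ** 2
            + abs(cfg["hidden_units"] - 64) / 64
            + abs(cfg["window_size"] - 20) / 20
            + random.gauss(0, 0.05))


def encode(cfg):
    """Map a configuration to a numeric feature vector for the surrogate MLP."""
    return [cfg["hidden_units"], np.log10(cfg["learning_rate"]), cfg["window_size"]]


def mutate(cfg):
    """Genetic-algorithm-style mutation: resample one hyperparameter at random."""
    out = dict(cfg)
    key = random.choice(list(GRID))
    out[key] = random.choice(GRID[key])
    return out


def search(budgets=(30, 15, 8), top_k=10, mutants=3, seed=0):
    random.seed(seed)
    tried = {}  # tuple(sorted(cfg.items())) -> measured loss
    candidates = random.sample(SPACE, budgets[0])
    for budget in budgets:
        # Evaluate at most `budget` new candidates; the budget shrinks each iteration.
        for cfg in candidates[:budget]:
            key = tuple(sorted(cfg.items()))
            if key not in tried:
                tried[key] = evaluate(cfg)
        X = np.array([encode(dict(k)) for k in tried])
        y = np.array(list(tried.values()))
        # The surrogate MLP predicts the loss for every point in the hyperparameter space.
        surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=seed)
        surrogate.fit(X, y)
        preds = surrogate.predict(np.array([encode(c) for c in SPACE]))
        ranked = [SPACE[i] for i in np.argsort(preds)]
        # Keep the predicted-best configurations and add a few mutants of them.
        candidates = ranked[:top_k] + [mutate(c) for c in ranked[:mutants]]
    best = min(tried, key=tried.get)
    return dict(best), tried[best]


if __name__ == "__main__":
    best_cfg, best_loss = search()
    print("best hyperparameters:", best_cfg, "  measured loss:", round(best_loss, 4))
```

The design choice mirrors the abstract: the expensive objective is evaluated on fewer and fewer points per iteration, while the cheap MLP surrogate re-ranks the entire hyperparameter space each time, so the search converges on high-quality configurations in a single run.
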
DOI: 10.1155/2022/6519909
ISSN: 1058-9244
EISSN: 1875-919X
Subjects: Air conditioning
Algorithms
Big Data
Computer centers
Data centers
Data mining
Energy consumption
Genetic algorithms
Heat
Iterative methods
Knowledge acquisition
Load
Machine learning
Model accuracy
Neural networks
Optimization
Prediction models
Random search method
Refrigeration
Servers
Stability
Task scheduling
Time series
Weighting methods