Hyperparameter Search for Machine Learning Algorithms for Optimizing the Computational Complexity


Detailed Description

Saved in:
Bibliographic Details
Published in: Processes 2023-02, Vol.11 (2), p.349
Main Authors: Ali, Yasser, Awwad, Emad, Al-Razgan, Muna, Maarouf, Ali
Format: Article
Language: English
Subjects:
Online Access: Full text
description For machine learning algorithms, fine-tuning hyperparameters is a computational challenge due to the large size of the problem space. An efficient strategy for adjusting hyperparameters can be established with the use of greedy search and swarm intelligence algorithms. The Random Search and Grid Search optimization techniques show promise and efficiency for this task, but the small population of solutions used at the outset, and the costly objective functions these searches evaluate, can lead to slow convergence or long execution times in some cases. In this research, we propose using the machine learning model known as the Support Vector Machine and optimizing it with four distinct algorithms—the Ant Bee Colony Algorithm, the Genetic Algorithm, the Whale Optimization Algorithm, and Particle Swarm Optimization—to evaluate the computational cost of the SVM after hyperparameter tuning. Computational complexity comparisons of these optimization algorithms were performed to determine the most effective strategies for hyperparameter tuning. It was found that the Genetic Algorithm had a lower time complexity than the other algorithms.
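The genetic-algorithm search the abstract describes can be sketched as a population of (C, gamma) candidates evolved by selection and mutation. This is a minimal stand-alone illustration, not the authors' implementation: the `objective` function below is a synthetic smooth stand-in for cross-validated SVM accuracy (in a real run it would train and score an SVM), and all names, ranges, and population sizes are illustrative assumptions.

```python
import math
import random

rng = random.Random(42)

def objective(C, gamma):
    # Stand-in for cross-validated SVM accuracy: a smooth bump that peaks
    # near C=10, gamma=0.1 on a log scale (purely illustrative).
    return math.exp(-((math.log10(C) - 1) ** 2 + (math.log10(gamma) + 1) ** 2))

def random_individual():
    # Log-uniform samples over plausible SVM hyperparameter ranges.
    return (10 ** rng.uniform(-2, 3), 10 ** rng.uniform(-4, 1))

def mutate(ind):
    # Perturb each hyperparameter multiplicatively (up to ~2x either way).
    C, gamma = ind
    return (C * 10 ** rng.uniform(-0.3, 0.3), gamma * 10 ** rng.uniform(-0.3, 0.3))

def genetic_search(pop_size=10, generations=20):
    # Evolve a population: keep the best half, refill with mutated parents.
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: objective(*p), reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection
        pop = parents + [mutate(rng.choice(parents))
                         for _ in range(pop_size - len(parents))]
    return max(pop, key=lambda p: objective(*p))

best = genetic_search()
print(f"best C={best[0]:.3g}, gamma={best[1]:.3g}, score={objective(*best):.3f}")
```

The point of comparison in the paper is cost: each `objective` call is one full model training-and-validation cycle, so an algorithm that converges in fewer evaluations (as the Genetic Algorithm did here) has lower time complexity in practice.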
doi_str_mv 10.3390/pr11020349
format Article
fulltext fulltext
identifier ISSN: 2227-9717
ispartof Processes, 2023-02, Vol.11 (2), p.349
issn 2227-9717
language eng
source Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; MDPI - Multidisciplinary Digital Publishing Institute
subjects Accuracy
Algorithms
Analysis
Artificial intelligence
Classification
Complexity
Computer applications
Computing costs
Data mining
Datasets
Genetic algorithms
Learning algorithms
Machine learning
Mathematical optimization
Methods
Neural networks
Optimization algorithms
Particle swarm optimization
Researchers
Search algorithms
Searches and seizures
Sensitivity analysis
Software
Support vector machines
Swarm intelligence
title Hyperparameter Search for Machine Learning Algorithms for Optimizing the Computational Complexity