Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection

Feature selection is the process of decreasing the number of features in a dataset by removing redundant, irrelevant, and randomly class-corrected data features. By applying feature selection on large, highly dimensional datasets, redundant features are removed, reducing the complexity of the data and reducing training time. The objective of this paper was to design an optimizer that combines the well-known metaheuristic population-based optimizer, the grey wolf algorithm, with the gradient descent algorithm, and to test it on feature selection problems. The proposed algorithm was first compared against the original grey wolf algorithm on 23 continuous test functions. The proposed optimizer was then adapted for feature selection, and 3 binary implementations were developed, with the final implementation compared against two implementations of the binary grey wolf optimizer and the binary grey wolf particle swarm optimizer on 6 medical datasets from the UCI machine learning repository, on metrics such as accuracy, size of feature subsets, F-measure, precision, and sensitivity. The proposed optimizer outperformed the other three optimizers on 3 of the 6 datasets in average metrics. It showed promise in its capability to balance the two objectives of feature selection and could be further enhanced.
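The abstract's hybrid idea can be illustrated with a minimal sketch: standard grey wolf position updates guided by the three best wolves (alpha, beta, delta), with a gradient-descent refinement of each candidate position, plus a sigmoid transfer step that maps a continuous position to a binary feature mask. This is an illustrative assumption of how such a hybrid might look, not the authors' implementation; the update rules, learning rate, and finite-difference gradient here are generic choices.

```python
import numpy as np

def numerical_gradient(f, x, eps=1e-6):
    """Central-difference gradient of f at x (used when no analytic gradient exists)."""
    grad = np.zeros_like(x)
    for i in range(len(x)):
        step = np.zeros_like(x)
        step[i] = eps
        grad[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad

def hybrid_gd_gwo(f, dim, bounds, n_wolves=10, n_iter=100, lr=0.01, seed=0):
    """Grey wolf optimizer with a gradient-descent refinement of each candidate."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    wolves = rng.uniform(lo, hi, size=(n_wolves, dim))
    for t in range(n_iter):
        # Rank the pack: alpha, beta, delta are the three best wolves.
        order = np.argsort([f(w) for w in wolves])
        alpha, beta, delta = wolves[order[:3]]
        a = 2 - 2 * t / n_iter  # encircling coefficient decays from 2 to 0
        for i in range(n_wolves):
            x_new = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                D = np.abs(C * leader - wolves[i])
                x_new += (leader - A * D) / 3.0
            # Gradient-descent refinement of the GWO candidate position.
            x_new -= lr * numerical_gradient(f, x_new)
            wolves[i] = np.clip(x_new, lo, hi)
    best = min(wolves, key=f)
    return best, f(best)

def binarize(x, rng):
    """Sigmoid transfer: map a continuous position to a 0/1 feature mask."""
    return (1 / (1 + np.exp(-x)) > rng.random(len(x))).astype(int)

# Demo on the sphere function, a common continuous benchmark.
sphere = lambda x: float(np.sum(x ** 2))
best, val = hybrid_gd_gwo(sphere, dim=5, bounds=(-10, 10))
print(val)  # small value: the pack converges toward the sphere minimum
```

In a feature-selection setting, `f` would instead score a feature mask (e.g., classifier error plus a subset-size penalty) and `binarize` would be applied to each wolf's position before evaluation; the specific fitness function and transfer functions used in the paper are not reproduced here.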

Bibliographic Details
Published in: BioMed Research International, 2021, Vol. 2021, p. 1-33
Authors: Kitonyi, Peter Mule; Segera, Davies Rene
Format: Article
Language: English
Publisher: Hindawi (New York)
PMID: 34497846
Online access: Full text
DOI: 10.1155/2021/2555622
ISSN: 2314-6133
EISSN: 2314-6141
Source: PubMed Central Open Access; Wiley-Blackwell Open Access Titles; PubMed Central; Alma/SFX Local Collection
Subjects:
Algorithms
Big Data
Business metrics
Continuity (mathematics)
Datasets
Design optimization
Feature selection
Genetic algorithms
Genetic engineering
Heuristic methods
Learning algorithms
Literature reviews
Machine learning
Medical research
Optimization algorithms
Optimization techniques
URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-05T03%3A00%3A55IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_pubme&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Hybrid%20Gradient%20Descent%20Grey%20Wolf%20Optimizer%20for%20Optimal%20Feature%20Selection&rft.jtitle=BioMed%20research%20international&rft.au=Kitonyi,%20Peter%20Mule&rft.date=2021&rft.volume=2021&rft.spage=1&rft.epage=33&rft.pages=1-33&rft.issn=2314-6133&rft.eissn=2314-6141&rft_id=info:doi/10.1155/2021/2555622&rft_dat=%3Cproquest_pubme%3E2571753887%3C/proquest_pubme%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2571753887&rft_id=info:pmid/34497846&rfr_iscdi=true