Adaptive K-NN metric classification based on improved Kepler optimization algorithm


Detailed description

Bibliographic details
Published in: The Journal of Supercomputing, 2025, Vol. 81 (1), Article 66
Main authors: Cai, Liang; Zhao, Shijie; Meng, Fanshuai; Zhang, Tianran
Format: Article
Language: English
Subjects:
Online access: Full text
container_issue 1
container_title The Journal of supercomputing
container_volume 81
creator Cai, Liang; Zhao, Shijie; Meng, Fanshuai; Zhang, Tianran
description The K-nearest neighbor (K-NN) method has been widely utilized in data mining and pattern recognition due to its elegant geometric basis and well-defined statistical characteristics. The number of nearest neighbors K and the distance metric employed by K-NN have a great impact on classification performance. To effectively improve K-NN classification capacity, this article proposes an improved Kepler optimization algorithm (termed IKOA) with an interaction-effect renewal mechanism and a dynamic rebalancing steady-state mechanism for enhancing global exploration and local exploitation. Furthermore, the IKOA-KNN algorithm, a novel K-NN variant, is constructed by integrating IKOA and K-NN to reinforce the classification accuracy of the canonical K-NN. In the classification process, the proposed IKOA synchronously optimizes the adaptive K value of K-NN and the well-posed p-norm value of the distance metric. In addition, feature selection based on IKOA is used to eliminate irrelevant and redundant features, so as to maintain or enhance classification accuracy. Experimental results on the CEC-2017 benchmark functions show that, compared with seven well-known meta-heuristic algorithms (i.e., WSO, BWO, DMOA, FLA, RSA, SPO, KOA), the proposed algorithm enhances exploitation and exploration ability, improves convergence speed, and is more stable when solving optimization problems. Fourteen bi- and multi-class datasets from the UCI Machine Learning Repository are used to demonstrate the effectiveness of the proposed IKOA-KNN in addressing real-world optimization problems. The statistical results of IKOA-KNN validate its excellent classification accuracy, reaching up to 100% on some of these datasets.
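To make the idea concrete, the following is a minimal sketch (not the paper's implementation) of the quantities IKOA tunes: a K-NN classifier parameterized by the neighbor count k and the Minkowski p-norm of its distance metric, plus the validation-accuracy objective that a metaheuristic would maximize over candidate (k, p) pairs. All function names here are illustrative assumptions.

```python
import math
from collections import Counter

def minkowski(a, b, p):
    # Minkowski (p-norm) distance; p=1 is Manhattan, p=2 is Euclidean.
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1.0 / p)

def knn_predict(train_X, train_y, query, k, p):
    # Rank training points by p-norm distance to the query and take a
    # majority vote among the labels of the k nearest ones.
    ranked = sorted(range(len(train_X)),
                    key=lambda i: minkowski(train_X[i], query, p))
    votes = Counter(train_y[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]

def fitness(train_X, train_y, val_X, val_y, k, p):
    # The kind of objective a metaheuristic such as IKOA would maximize:
    # classification accuracy of the (k, p) candidate on held-out data.
    correct = sum(knn_predict(train_X, train_y, x, k, p) == y
                  for x, y in zip(val_X, val_y))
    return correct / len(val_X)
```

In the paper's setup the search space would also include a binary feature mask for feature selection; the sketch above covers only the (k, p) portion of the candidate encoding.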
doi_str_mv 10.1007/s11227-024-06559-y
format Article
publisher New York: Springer US
rights The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2024.
fulltext fulltext
identifier ISSN: 0920-8542
ispartof The Journal of supercomputing, 2025, Vol.81 (1), Article 66
issn 0920-8542
eissn 1573-0484
language eng
recordid cdi_proquest_journals_3120031616
source SpringerNature Journals
subjects Accuracy
Adaptive algorithms
Classification
Compilers
Computer Science
Data mining
Datasets
Exploitation
Heuristic methods
Interpreters
K-nearest neighbors algorithm
Machine learning
Optimization
Optimization algorithms
Pattern recognition
Processor Architectures
Programming Languages
title Adaptive K-NN metric classification based on improved Kepler optimization algorithm