NAS-MFF: NAS-Guided Multiscale Feature Fusion Network With Pareto Optimization for Sonar Images Classification

Underwater target recognition technology based on sonar images has received considerable critical attention in recent years. However, the sonar sensors encounter disturbance from seafloor reverberation noise and a complicated background, resulting in notable difficulties for precise sonar target classification. On the other hand, traditional machine learning methods inevitably lose features relying on expert systems, and manual network creation is relatively inefficient with limited sonar data. To tackle these challenges, we propose a neural architecture search (NAS)-guided multiscale feature fusion (NAS-MFF) algorithm for sonar images classification based on the differentiable architecture search. Specifically, our approach consists of two stages: a search stage with the Pareto optimization, and a training stage using the optimal architecture. NAS-MFF begins by reconfiguring the search space based on the characteristics of sonar images, which includes the introduction of the MF Block {k} with multiscale feature extraction ability. By synergizing a recognition-driven convolutional neural network (CNN) with Pareto optimization, it achieves a dual advantage in both accuracy and model efficiency using the available data. Extensive experiments on three sonar image datasets of different sizes and sources distinctly demonstrate that NAS-MFF outperforms several existing manual design methods and NAS approaches.
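To make the approach described above concrete, the sketch below illustrates two of its ingredients: a block that extracts and fuses features at several kernel sizes, in the spirit of the MF Block {k}, and a Pareto-dominance filter over (error, model size) pairs of the kind used to balance accuracy against efficiency during the search stage. This is a minimal, hypothetical PyTorch sketch; the class and function names, kernel sizes, and concatenation-based fusion are assumptions made for illustration and are not taken from the paper.

# Hypothetical illustration only: the block design, kernel sizes, and the
# (error, size) objectives are assumptions for exposition, not the published
# NAS-MFF architecture or its exact search procedure.
import torch
import torch.nn as nn


class MultiscaleFusionBlock(nn.Module):
    """Extracts features at several receptive fields and fuses them by concatenation."""

    def __init__(self, in_channels: int, out_channels: int, kernel_sizes=(3, 5, 7)):
        super().__init__()
        branch_channels = out_channels // len(kernel_sizes)
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_channels, branch_channels, k, padding=k // 2, bias=False),
                nn.BatchNorm2d(branch_channels),
                nn.ReLU(inplace=True),
            )
            for k in kernel_sizes
        ])
        # 1x1 convolution mixes the concatenated multiscale branches back to out_channels.
        self.fuse = nn.Conv2d(branch_channels * len(kernel_sizes), out_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fuse(torch.cat([branch(x) for branch in self.branches], dim=1))


def pareto_front(candidates):
    """Keep candidates not dominated on (error, size); lower is better for both objectives."""
    front = []
    for i, (err_i, size_i) in enumerate(candidates):
        dominated = any(
            err_j <= err_i and size_j <= size_i and (err_j < err_i or size_j < size_i)
            for j, (err_j, size_j) in enumerate(candidates)
            if j != i
        )
        if not dominated:
            front.append((err_i, size_i))
    return front


if __name__ == "__main__":
    block = MultiscaleFusionBlock(in_channels=1, out_channels=48)
    sonar_patch = torch.randn(2, 1, 64, 64)  # a batch of single-channel sonar patches
    print(block(sonar_patch).shape)          # torch.Size([2, 48, 64, 64])
    # Candidate architectures scored as (validation error, parameter count in millions).
    print(pareto_front([(0.10, 3.2), (0.12, 1.1), (0.15, 4.0)]))  # [(0.1, 3.2), (0.12, 1.1)]

Running the script prints the fused feature-map shape and drops the dominated candidate (0.15, 4.0) from the toy Pareto front.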


Bibliographic Details
Published in: IEEE Sensors Journal, 2024-05, Vol. 24 (9), p. 14656-14667
Main authors: Chen, Yule; Liang, Hong; Jiao, Shaohua
Format: Article
Language: English
Subjects:
Online access: Order full text
container_end_page 14667
container_issue 9
container_start_page 14656
container_title IEEE sensors journal
container_volume 24
creator Chen, Yule
Liang, Hong
Jiao, Shaohua
description Underwater target recognition technology based on sonar images has received considerable critical attention in recent years. However, the sonar sensors encounter disturbance from seafloor reverberation noise and a complicated background, resulting in notable difficulties for precise sonar target classification. On the other hand, traditional machine learning methods inevitably lose features relying on expert systems, and manual network creation is relatively inefficient with limited sonar data. To tackle these challenges, we propose a neural architecture search (NAS)-guided multiscale feature fusion (NAS-MFF) algorithm for sonar images classification based on the differentiable architecture search. Specifically, our approach consists of two stages: a search stage with the Pareto optimization, and a training stage using the optimal architecture. NAS-MFF begins by reconfiguring the search space based on the characteristics of sonar images, which includes the introduction of the MF Block {k} with multiscale feature extraction ability. By synergizing a recognition-driven convolutional neural network (CNN) with Pareto optimization, it achieves a dual advantage in both accuracy and model efficiency using the available data. Extensive experiments on three sonar image datasets of different sizes and sources distinctly demonstrate that NAS-MFF outperforms several existing manual design methods and NAS approaches.
doi_str_mv 10.1109/JSEN.2024.3375372
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 1530-437X
ispartof IEEE sensors journal, 2024-05, Vol.24 (9), p.14656-14667
issn 1530-437X
1558-1748
language eng
recordid cdi_crossref_primary_10_1109_JSEN_2024_3375372
source IEEE Electronic Library (IEL)
subjects Algorithms
Artificial neural networks
Background noise
Deep learning (DL)
Expert systems
Feature extraction
few-shot
Image classification
Imaging
Machine learning
multiobjective optimization
neural architecture search (NAS)
Neural networks
Ocean floor
Optimization
Pareto optimization
Searching
Sonar
Sonar applications
sonar images classification
Target recognition
title NAS-MFF: NAS-Guided Multiscale Feature Fusion Network With Pareto Optimization for Sonar Images Classification
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-15T23%3A52%3A35IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=NAS-MFF:%20NAS-Guided%20Multiscale%20Feature%20Fusion%20Network%20With%20Pareto%20Optimization%20for%20Sonar%20Images%20Classification&rft.jtitle=IEEE%20sensors%20journal&rft.au=Chen,%20Yule&rft.date=2024-05-01&rft.volume=24&rft.issue=9&rft.spage=14656&rft.epage=14667&rft.pages=14656-14667&rft.issn=1530-437X&rft.eissn=1558-1748&rft.coden=ISJEAZ&rft_id=info:doi/10.1109/JSEN.2024.3375372&rft_dat=%3Cproquest_RIE%3E3049492297%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3049492297&rft_id=info:pmid/&rft_ieee_id=10473679&rfr_iscdi=true