Semi-Supervised SAR ATR Based on Contrastive Learning and Complementary Label Learning

Deep-learning-based methods have recently achieved significant advancements in synthetic aperture radar automatic target recognition (SAR ATR). However, these methods typically rely heavily on extensive annotations, which are difficult to obtain for SAR images. Semi-supervised learning offers a solution to improve model performance with limited labeled data by leveraging unlabeled data.

Detailed Description

Bibliographic Details
Published in: IEEE geoscience and remote sensing letters, 2024, Vol.21, p.1-5
Main authors: Li, Chen; Du, Lan; Du, Yuang
Format: Article
Language: eng
Subjects:
Online access: Order full text
container_end_page 5
container_issue
container_start_page 1
container_title IEEE geoscience and remote sensing letters
container_volume 21
creator Li, Chen
Du, Lan
Du, Yuang
description Deep-learning-based methods have recently achieved significant advancements in synthetic aperture radar automatic target recognition (SAR ATR). However, these methods typically rely heavily on extensive annotations, which are difficult to obtain for SAR images. Semi-supervised learning offers a solution to improve model performance with limited labeled data by leveraging unlabeled data. The mainstream semi-supervised learning methods for SAR ATR typically select high-confidence unlabeled images and assign them pseudo-labels for inclusion in the model training process. However, the large number of low-confidence unlabeled images is not efficiently utilized. To address this issue, a semi-supervised SAR target recognition method based on contrastive learning and complementary label (CoL) learning is proposed. First, CoL learning assigns a CoL to each low-confidence unlabeled image based on its minimum prediction probability. Subsequently, a threshold is set to filter out unreliable CoLs, thereby mitigating the adverse effects of erroneous CoLs. This approach ensures the effective and comprehensive utilization of low-confidence unlabeled images. Additionally, we propose a contrastive loss that incorporates CoLs. Compared to traditional contrastive losses, the proposed loss constructs a richer set of negative sample pairs by leveraging the characteristics of CoLs more effectively. Consequently, this approach improves the utilization of low-confidence images and further improves recognition performance. Experiments on the MSTAR dataset demonstrate that, compared with current state-of-the-art semi-supervised recognition methods, the proposed method achieves better recognition performance with limited labeled images.
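To make the description above more concrete, the following is a minimal, hypothetical PyTorch-style sketch of the complementary-label step as summarized in the abstract: a CoL is taken as the class with the minimum predicted probability for a low-confidence unlabeled image, unreliable CoLs are filtered with a threshold, and a complementary term penalizes probability mass on the CoL. All function names and threshold values (conf_threshold, col_threshold) are illustrative assumptions, not taken from the paper; the paper's CoL-aware contrastive loss, which additionally uses CoLs to construct extra negative sample pairs, is not reproduced here.

import torch
import torch.nn.functional as F

def assign_complementary_labels(logits, conf_threshold=0.95, col_threshold=0.05):
    # Hypothetical sketch: split unlabeled predictions into high-confidence
    # pseudo-labels and low-confidence complementary labels (CoLs).
    probs = F.softmax(logits, dim=1)              # (N, C) class probabilities
    max_prob, pseudo_label = probs.max(dim=1)     # candidate pseudo-labels
    min_prob, col = probs.min(dim=1)              # CoL = class with minimum probability
    high_conf = max_prob >= conf_threshold        # images trusted for pseudo-labeling
    reliable_col = (~high_conf) & (min_prob <= col_threshold)  # filter unreliable CoLs
    return pseudo_label, high_conf, col, reliable_col

def complementary_label_loss(logits, col, reliable_col):
    # Push probability mass away from the complementary label:
    # minimize -log(1 - p_col) over images with a reliable CoL.
    probs = F.softmax(logits, dim=1)
    p_col = probs.gather(1, col.unsqueeze(1)).squeeze(1)
    loss = -torch.log(1.0 - p_col + 1e-8)
    n = reliable_col.float().sum().clamp(min=1.0)
    return (loss * reliable_col.float()).sum() / n

In a training loop, such a term would be added alongside the supervised and pseudo-label losses, so that low-confidence unlabeled images still provide a learning signal even when no pseudo-label is assigned.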
doi_str_mv 10.1109/LGRS.2024.3458948
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 1545-598X
ispartof IEEE geoscience and remote sensing letters, 2024, Vol.21, p.1-5
issn 1545-598X
1558-0571
language eng
recordid cdi_ieee_primary_10693680
source IEEE Electronic Library (IEL)
subjects Annotations
Automatic target recognition
Complementary label (CoL) learning
Contrastive learning
Deep learning
Geoscience and remote sensing
Image contrast
Image filters
Image recognition
Labels
Learning
Object detection
Radar remote sensing
SAR (radar)
Semi-supervised learning
Synthetic aperture radar
synthetic aperture radar (SAR)
Target recognition
Training
title Semi-Supervised SAR ATR Based on Contrastive Learning and Complementary Label Learning