Separation Fusion Transformer and Efficient Reuse Matching Network for Aerial Tracking
Due to the uniqueness of its perspective and the continuity of tracking, the current utilization of UAVs in tracking has demonstrated immense potential in the field of aviation remote sensing. However, constrained by the complexities of the real-world environment during the tracking process and the limited computational capabilities of the onboard computing platform, existing tracking networks struggle to effectively combine superior tracking performance with efficient computational speed.
Saved in:
Published in: IEEE geoscience and remote sensing letters, 2024, Vol. 21, p. 1-5
Main authors: Deng, Anping; Chen, Dianbing; Han, Guangliang; Yang, Hang; Liu, Zhichao; Liu, Faxue
Format: Article
Language: English
Online access: Order full text
container_end_page | 5 |
container_issue | |
container_start_page | 1 |
container_title | IEEE geoscience and remote sensing letters |
container_volume | 21 |
creator | Deng, Anping; Chen, Dianbing; Han, Guangliang; Yang, Hang; Liu, Zhichao; Liu, Faxue |
description | Due to the uniqueness of its perspective and the continuity of tracking, the current utilization of UAVs in tracking has demonstrated immense potential in the field of aviation remote sensing. However, constrained by the complexities of the real-world environment during the tracking process and the limited computational capabilities of the onboard computing platform, existing tracking networks struggle to effectively combine superior tracking performance with efficient computational speed. To address this pivotal issue, we propose the Separation Fusion Transformer and Efficient Reuse Matching Network for Aerial Tracking (SFTrack). Specifically, we design the separation fusion transformer (SFT), a meticulously crafted low-latency transformer architecture that extracts robust feature information about the target, enhancing the algorithm's ability to distinguish the target from intricate backgrounds. Furthermore, the efficient reuse matching network (ERM) efficiently performs feature matching to accurately determine the target's scale and location. SFTrack delivers efficient and precise performance in drone tracking tasks. Compared with other drone-to-ground tracking algorithms, SFTrack demonstrates a significant lead in accuracy across multiple datasets while achieving an impressive 54 frames per second (FPS) on embedded platforms, validating the feasibility and effectiveness of our tracker in practical drone deployments. |
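The abstract describes a two-stage pipeline: a backbone (SFT) extracts features from the template and search region, and a matching head (ERM) compares them to locate the target. The sketch below is a generic, illustrative stand-in for that pattern only; it is not the paper's architecture. The function names `extract_features` and `match` are hypothetical, the random-projection "backbone" and cosine-similarity "matcher" are placeholder assumptions with the same input/output roles as the SFT and ERM described above.

```python
import numpy as np

def extract_features(patch: np.ndarray, dim: int = 32) -> np.ndarray:
    """Placeholder feature extractor (the paper's SFT is a low-latency
    transformer; this fixed random projection only mimics its role)."""
    rng = np.random.default_rng(42)  # fixed seed: same projection for every patch
    proj = rng.standard_normal((patch.size, dim)) / np.sqrt(patch.size)
    return patch.astype(np.float64).ravel() @ proj

def match(template_feat: np.ndarray, candidate_feats: np.ndarray):
    """Placeholder matching head (the paper's ERM performs feature matching
    to localize the target; here: cosine similarity, best index wins)."""
    t = template_feat / (np.linalg.norm(template_feat) + 1e-12)
    c = candidate_feats / (np.linalg.norm(candidate_feats, axis=1, keepdims=True) + 1e-12)
    scores = c @ t
    return int(np.argmax(scores)), scores

# Toy usage: the candidate window identical to the template should score highest.
template = np.arange(64.0).reshape(8, 8)
candidates = np.stack([np.zeros((8, 8)), template, np.ones((8, 8))])
feats = np.array([extract_features(c) for c in candidates])
best, scores = match(extract_features(template), feats)
```

In a real tracker the candidates would be densely sampled windows (or a correlation map) over the search region, and the matcher would also regress the target's scale, as the ERM is said to do.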
doi_str_mv | 10.1109/LGRS.2024.3436846 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1545-598X |
ispartof | IEEE geoscience and remote sensing letters, 2024, Vol.21, p.1-5 |
issn | 1545-598X; 1558-0571 |
language | eng |
recordid | cdi_proquest_journals_3100625532 |
source | IEEE Electronic Library (IEL) |
subjects | Aerial tracking; Algorithms; Aviation; Computer applications; Correlation; Data mining; efficient transformer; Feature extraction; Frames per second; Information processing; Latency; Matching; Network latency; Remote sensing; Separation; Target tracking; Task analysis; Tracking; Tracking networks; Training; Transformers; Unmanned aerial vehicles |
title | Separation Fusion Transformer and Efficient Reuse Matching Network for Aerial Tracking |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-06T12%3A52%3A03IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Separation%20Fusion%20Transformer%20and%20Efficient%20Reuse%20Matching%20Network%20for%20Aerial%20Tracking&rft.jtitle=IEEE%20geoscience%20and%20remote%20sensing%20letters&rft.au=Deng,%20Anping&rft.date=2024&rft.volume=21&rft.spage=1&rft.epage=5&rft.pages=1-5&rft.issn=1545-598X&rft.eissn=1558-0571&rft.coden=IGRSBY&rft_id=info:doi/10.1109/LGRS.2024.3436846&rft_dat=%3Cproquest_RIE%3E3100625532%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3100625532&rft_id=info:pmid/&rft_ieee_id=10620024&rfr_iscdi=true |