Multi-Feature Fusion in Particle Filter Framework for Visual Tracking
In this article, a particle filter based tracking algorithm is proposed to track a target in video with vivid and complex environments. The target is represented in feature space by both color distribution and KAZE features. Color distribution is selected for its robustness to the target's scale variation and partial occlusion.
Published in: | IEEE sensors journal 2020-03, Vol.20 (5), p.2405-2415 |
---|---|
Main authors: | Bhat, Pranab Gajanan; Subudhi, Badri Narayan; Veerakumar, T.; Laxmi, Vijay; Gaur, Manoj Singh |
Format: | Article |
Language: | English |
container_end_page | 2415 |
---|---|
container_issue | 5 |
container_start_page | 2405 |
container_title | IEEE sensors journal |
container_volume | 20 |
creator | Bhat, Pranab Gajanan; Subudhi, Badri Narayan; Veerakumar, T.; Laxmi, Vijay; Gaur, Manoj Singh |
description | In this article, a particle filter based tracking algorithm is proposed to track a target in videos with vivid and complex environments. The target is represented in feature space by both color distribution and KAZE features. Color distribution is selected for its robustness to the target's scale variation and partial occlusion. KAZE features are chosen for their ability to represent the target structure and for their superior performance in feature matching. Fusing these two features leads to more effective tracking than other features under challenging conditions, owing to their better representational abilities. The trajectory of the target is established using the particle filter algorithm, based on the similarity between the features extracted from the target and the probable candidates in consecutive frames. For the color distribution model, the Bhattacharyya coefficient is used as the similarity metric, whereas the Nearest Neighbor Distance Ratio is used for matching corresponding feature points in the KAZE algorithm. The particle filter update model is based on kinematic motion equations, and the particle weights are governed by an equation fusing both the color and KAZE features. Centre Location Error, Average Tracking Accuracy and Tracking Success Rate are the performance metrics considered in the evaluation process; the overlap success plot and precision plot are also considered. On the basis of these metrics and visual results obtained under different environmental conditions (outdoor, occluding and underwater), the proposed tracking scheme performs significantly better than contemporary feature-based iterative object tracking methods and even a few learning-based algorithms. |
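The abstract above describes a concrete pipeline: a Bhattacharyya coefficient compares color histograms, a Nearest Neighbor Distance Ratio test matches KAZE descriptors, particles are propagated by a kinematic (constant-velocity) model, and the two cues are fused into particle weights. A minimal sketch of those pieces follows; the function names, the `alpha` mixing parameter, and the simple convex fusion are illustrative assumptions, not the paper's published equations.

```python
import numpy as np


def bhattacharyya_coefficient(p, q):
    """Similarity of two normalized color histograms (1.0 = identical)."""
    return float(np.sum(np.sqrt(p * q)))


def nndr_matches(desc_a, desc_b, ratio=0.8):
    """Nearest Neighbor Distance Ratio matching of feature descriptors.

    A match (i, j) is accepted only if the closest descriptor in desc_b
    is sufficiently closer than the second-closest (desc_b needs >= 2 rows).
    """
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        if dists[order[0]] < ratio * dists[order[1]]:
            matches.append((i, int(order[0])))
    return matches


def propagate_particles(states, dt=1.0, noise_std=2.0, rng=None):
    """Constant-velocity kinematic update of particle states [x, y, vx, vy]."""
    rng = rng if rng is not None else np.random.default_rng(0)
    out = states.copy()
    out[:, 0] += states[:, 2] * dt          # x += vx * dt
    out[:, 1] += states[:, 3] * dt          # y += vy * dt
    out[:, :2] += rng.normal(0.0, noise_std, size=(len(states), 2))
    return out


def fused_weights(color_sims, kaze_scores, alpha=0.5):
    """Fuse per-particle color and KAZE scores into normalized weights.

    alpha is a hypothetical mixing parameter, not the paper's fusion equation.
    """
    w = alpha * np.asarray(color_sims, float) \
        + (1.0 - alpha) * np.asarray(kaze_scores, float)
    return w / w.sum()
```

In a full tracker, each particle's candidate region would yield a color histogram (scored against the target model via the Bhattacharyya coefficient) and a KAZE descriptor set (scored by its NNDR match count); the fused weights then drive resampling.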
doi_str_mv | 10.1109/JSEN.2019.2954331 |
format | Article |
identifier | ISSN: 1530-437X |
ispartof | IEEE sensors journal, 2020-03, Vol.20 (5), p.2405-2415 |
issn | 1530-437X (ISSN); 1558-1748 (EISSN) |
language | eng |
recordid | cdi_proquest_journals_2352194689 |
source | IEEE Electronic Library (IEL) |
subjects | Algorithms; Color; Equations of motion; Feature extraction; fusion; Image color analysis; Iterative methods; KAZE; Kinematics; Machine learning; Matching; Mathematical model; Occlusion; Optical tracking; particle filter; Performance evaluation; Performance measurement; Similarity; Target tracking; Visual tracking; Visualization |
title | Multi-Feature Fusion in Particle Filter Framework for Visual Tracking |