Efficient Through-Wall Human Pose Reconstruction Using UWB MIMO Radar


Bibliographic Details
Published in: IEEE Antennas and Wireless Propagation Letters, 2022-03, Vol. 21 (3), pp. 571-575
Main authors: Song, Yongkun; Jin, Tian; Dai, Yongpeng; Zhou, Xiaolong
Format: Article
Language: English
Subjects:
Online access: Order full text
description In this letter, we introduce UWB-Pose, a through-wall human pose reconstruction framework using ultrawideband (UWB) multiple-input--multiple-output (MIMO) radar. Previous radio-frequency-based works have achieved two-dimensional (2-D) and 3-D human pose reconstruction for human detection in occluded scenes. However, these methods are difficult to adapt to new environments and often suffer from high computational complexity. To address this issue, we first construct 3-D radar images of human targets using a UWB MIMO radar system and transform those radar images into discrete 3-D point data. We then design a lightweight deep learning network to extract human body features from the input point data, and finally convert the features into 3-D pose coordinates. Comparative experimental results show that the human pose reconstruction error of our UWB-Pose framework can be as low as 38.84 mm. Importantly, the numbers of parameters and floating-point operations are reduced to 1.02 M and 2.75 G, respectively, meeting the needs of practical deployment and broadening the scope of applicability.
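The abstract's pipeline step of transforming a 3-D radar image into discrete 3-D point data can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the authors' exact method: it assumes the radar image is a voxel intensity volume and keeps only the strongest returns above a threshold, emitting (x, y, z, intensity) points. The function name, threshold, and point cap are all illustrative choices.

```python
import numpy as np

def radar_image_to_points(volume, threshold, max_points=1024):
    """Convert a 3-D radar intensity volume to an (N, 4) point array
    of (x, y, z, intensity), keeping the strongest returns first."""
    mask = volume > threshold
    idx = np.argwhere(mask)                       # voxel coordinates above threshold
    vals = volume[mask]                           # their intensities (same C-order as idx)
    order = np.argsort(vals)[::-1][:max_points]   # strongest first, capped at max_points
    return np.hstack([idx[order], vals[order, None]])

# Toy example: a 16^3 volume of weak noise plus two strong scatterers.
rng = np.random.default_rng(0)
vol = rng.random((16, 16, 16)) * 0.1
vol[4, 5, 6] = 0.9
vol[10, 2, 8] = 0.8
pts = radar_image_to_points(vol, threshold=0.5)
print(pts.shape)  # (2, 4)
```

The fixed-size, sparse point set produced this way is the kind of input a lightweight point-based network (rather than a dense 3-D CNN over the full voxel grid) can process cheaply, which is consistent with the letter's emphasis on low parameter and FLOP counts.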
doi_str_mv 10.1109/LAWP.2021.3138512
format Article
publisher New York: IEEE
coden IAWPA7
ieee_id 9664385
rights Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
orcidid 0000-0001-7592-2315
0000-0002-0734-9833
0000-0002-4142-6265
fulltext fulltext_linktorsrc
identifier ISSN: 1536-1225
ispartof IEEE antennas and wireless propagation letters, 2022-03, Vol.21 (3), p.571-575
issn 1536-1225
1548-5757
language eng
recordid cdi_proquest_journals_2635045789
source IEEE Electronic Library (IEL)
subjects Antenna arrays
Deep learning
Feature extraction
Floating point arithmetic
human pose reconstruction
Image reconstruction
Occlusion
Radar
Radar antennas
Radar equipment
Radar imaging
Three-dimensional displays
through-wall
Transmitting antennas
ultrawideband (UWB) radar
Ultrawideband radar
title Efficient Through-Wall Human Pose Reconstruction Using UWB MIMO Radar
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-25T02%3A23%3A29IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Efficient%20Through-Wall%20Human%20Pose%20Reconstruction%20Using%20UWB%20MIMO%20Radar&rft.jtitle=IEEE%20antennas%20and%20wireless%20propagation%20letters&rft.au=Song,%20Yongkun&rft.date=2022-03-01&rft.volume=21&rft.issue=3&rft.spage=571&rft.epage=575&rft.pages=571-575&rft.issn=1536-1225&rft.eissn=1548-5757&rft.coden=IAWPA7&rft_id=info:doi/10.1109/LAWP.2021.3138512&rft_dat=%3Cproquest_RIE%3E2635045789%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2635045789&rft_id=info:pmid/&rft_ieee_id=9664385&rfr_iscdi=true