Human Detection Based on Time-Varying Signature on Range-Doppler Diagram Using Deep Neural Networks
We propose the detection of humans using millimeter-wave FMCW radar based on time-varying signatures of range-Doppler diagrams using deep recurrent neural networks (DRNNs). Demand for human detection has recently increased for security, surveillance, and search and rescue purposes, with a particular focus on urban areas filled with clutter and moving targets. We suggest classifying targets based on their signatures in range-Doppler plots over time, because these signatures can be observed consecutively. We measure five target types: humans, cars, cyclists, dogs, and road clutter, using millimeter-wave FMCW radar that transmits fast chirps at 77 GHz. To maximize classification accuracy using the time-varying range-Doppler signatures of the targets, we investigate and compare the performance of a 2-D deep convolutional neural network (DCNN), a 3-D DCNN, and a DRNN combined with a 2-D DCNN. The DRNN combined with the 2-D DCNN showed the best performance, with an overall classification accuracy of 99% and a human classification rate of 100%.
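The abstract refers to range-Doppler diagrams produced by a fast-chirp FMCW radar. As a rough illustration, such a map is conventionally formed by a 2-D FFT over a chirps-by-samples data cube: an FFT along fast time yields range bins, and an FFT along slow time yields Doppler bins. The sketch below uses random I/Q data; the cube shape, window choice, and scaling are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical raw FMCW data cube: n_chirps slow-time rows x n_samples fast-time columns.
n_chirps, n_samples = 128, 256
rng = np.random.default_rng(0)
iq = rng.standard_normal((n_chirps, n_samples)) + 1j * rng.standard_normal((n_chirps, n_samples))

# Range FFT along fast time (one FFT per chirp), then Doppler FFT along slow time,
# with zero Doppler shifted to the center of the axis.
window = np.hanning(n_samples)
range_fft = np.fft.fft(iq * window, axis=1)
doppler_fft = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)

# Log-magnitude range-Doppler map: the kind of 2-D frame a DCNN would consume.
rd_map = 20 * np.log10(np.abs(doppler_fft) + 1e-12)
print(rd_map.shape)  # (128, 256): Doppler bins x range bins
```

Stacking consecutive maps over time gives the time-varying signature sequence that the paper's recurrent models operate on.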
Saved in:
Published in: | IEEE geoscience and remote sensing letters 2021-03, Vol.18 (3), p.426-430 |
Main authors: | Kim, Youngwook; Alnujaim, Ibrahim; You, Sungjin; Jeong, Byung Jang |
Format: | Article |
Language: | English |
Subjects: | Accuracy; Artificial neural networks; Automobiles; Classification; Clutter; Deep convolutional neural networks (DCNN); Detection; Dogs; Doppler effect; Doppler sonar; Feature extraction; FMCW radar; human detection; Millimeter waves; Moving targets; Neural networks; Radar; Radar detection; Radar signatures; range-Doppler diagram; Recurrent neural networks; Search and rescue; Security; Urban areas |
Online access: | Order full text |
container_end_page | 430 |
container_issue | 3 |
container_start_page | 426 |
container_title | IEEE geoscience and remote sensing letters |
container_volume | 18 |
creator | Kim, Youngwook Alnujaim, Ibrahim You, Sungjin Jeong, Byung Jang |
description | We propose the detection of humans using millimeter-wave FMCW radar based on time-varying signatures of range-Doppler diagrams using deep recurrent neural networks (DRNNs). Demand for human detection has recently increased for security, surveillance, and search and rescue purposes, with a particular focus on urban areas filled with clutter and moving targets. We suggest classifying targets based on their signatures in range-Doppler plots over time, because these signatures can be observed consecutively. We measure five target types: humans, cars, cyclists, dogs, and road clutter, using millimeter-wave FMCW radar that transmits fast chirps at 77 GHz. To maximize classification accuracy using the time-varying range-Doppler signatures of the targets, we investigate and compare the performance of a 2-D deep convolutional neural network (DCNN), a 3-D DCNN, and a DRNN combined with a 2-D DCNN. The DRNN combined with the 2-D DCNN showed the best performance, with an overall classification accuracy of 99% and a human classification rate of 100%. |
doi_str_mv | 10.1109/LGRS.2020.2980320 |
format | Article |
publisher | Piscataway: IEEE |
rights | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021 |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1545-598X |
ispartof | IEEE geoscience and remote sensing letters, 2021-03, Vol.18 (3), p.426-430 |
issn | 1545-598X; 1558-0571 |
language | eng |
recordid | cdi_ieee_primary_9043726 |
source | IEEE Electronic Library (IEL) |
subjects | Accuracy; Artificial neural networks; Automobiles; Classification; Clutter; Deep convolutional neural networks (DCNN); Detection; Dogs; Doppler effect; Doppler sonar; Feature extraction; FMCW radar; human detection; Millimeter waves; Moving targets; Neural networks; Radar; Radar detection; Radar signatures; range-Doppler diagram; Recurrent neural networks; Search and rescue; Security; Urban areas |
title | Human Detection Based on Time-Varying Signature on Range-Doppler Diagram Using Deep Neural Networks |
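The abstract's best-performing model pairs a 2-D DCNN with a DRNN over consecutive range-Doppler frames. The sketch below illustrates that general pattern, assuming PyTorch: a small 2-D CNN encodes each frame, and a recurrent layer aggregates the frame features into a sequence-level class prediction. The layer sizes, the GRU cell, and the five-class head (humans, cars, cyclists, dogs, road clutter) are placeholders for illustration, not the authors' published architecture.

```python
import torch
import torch.nn as nn

class RangeDopplerSequenceClassifier(nn.Module):
    """Per-frame 2-D CNN features fed to a GRU; sizes are illustrative."""

    def __init__(self, n_classes: int = 5):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> (batch*time, 16)
        )
        self.rnn = nn.GRU(input_size=16, hidden_size=32, batch_first=True)
        self.head = nn.Linear(32, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, 1, doppler_bins, range_bins)
        b, t = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1)).view(b, t, -1)  # per-frame CNN features
        _, h = self.rnn(feats)                            # final hidden state
        return self.head(h[-1])                           # class logits

# A batch of 2 sequences, each 10 consecutive 64x64 range-Doppler frames.
logits = RangeDopplerSequenceClassifier()(torch.randn(2, 10, 1, 64, 64))
print(logits.shape)  # torch.Size([2, 5])
```

The key design point this mirrors is that a plain 2-D CNN sees one frame at a time, while the recurrent layer lets the classifier exploit how a target's range-Doppler signature evolves across frames.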