An FCNN-Based Super-Resolution mmWave Radar Framework for Contactless Musical Instrument Interface

In this article, we propose a framework for contactless human-computer interaction (HCI) using novel tracking techniques based on deep learning-based super-resolution and tracking algorithms. Our system offers unprecedented high-resolution tracking of hand position and motion characteristics by leveraging spatial and temporal features embedded in the reflected radar waveform. Rather than classifying samples from a predefined set of hand gestures, as common in existing work on deep learning with mmWave radar, our proposed imager employs a regressive full convolutional neural network (FCNN) approach to improve localization accuracy by spatial super-resolution. While the proposed techniques are suitable for a host of tracking applications, this article focuses on their application as a musical interface to demonstrate the robustness of the gesture sensing pipeline and deep learning signal processing chain. The user can control the instrument by varying the position and velocity of their hand above the vertically-facing sensor. By employing a commercially available multiple-input-multiple-output (MIMO) radar rather than a traditional optical sensor, our framework demonstrates the efficacy of the mmWave sensing modality for fine motion tracking and offers an elegant solution to a host of HCI tasks. Additionally, we provide a freely available software package and user interface for controlling the device, streaming the data to MATLAB in real-time, and increasing accessibility to the signal processing and device interface functionality utilized in this article.

Detailed Description

Saved in:
Bibliographic Details
Published in: IEEE Transactions on Multimedia, 2022, Vol. 24, p. 2315-2328
Main authors: Smith, Josiah W.; Furxhi, Orges; Torlak, Murat
Format: Article
Language: English
Subjects:
Online access: Order full text
container_end_page 2328
container_issue
container_start_page 2315
container_title IEEE transactions on multimedia
container_volume 24
creator Smith, Josiah W.
Furxhi, Orges
Torlak, Murat
description In this article, we propose a framework for contactless human-computer interaction (HCI) using novel tracking techniques based on deep learning-based super-resolution and tracking algorithms. Our system offers unprecedented high-resolution tracking of hand position and motion characteristics by leveraging spatial and temporal features embedded in the reflected radar waveform. Rather than classifying samples from a predefined set of hand gestures, as common in existing work on deep learning with mmWave radar, our proposed imager employs a regressive full convolutional neural network (FCNN) approach to improve localization accuracy by spatial super-resolution. While the proposed techniques are suitable for a host of tracking applications, this article focuses on their application as a musical interface to demonstrate the robustness of the gesture sensing pipeline and deep learning signal processing chain. The user can control the instrument by varying the position and velocity of their hand above the vertically-facing sensor. By employing a commercially available multiple-input-multiple-output (MIMO) radar rather than a traditional optical sensor, our framework demonstrates the efficacy of the mmWave sensing modality for fine motion tracking and offers an elegant solution to a host of HCI tasks. Additionally, we provide a freely available software package and user interface for controlling the device, streaming the data to MATLAB in real-time, and increasing accessibility to the signal processing and device interface functionality utilized in this article.
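
The abstract's central technical idea is to treat localization as a regression problem: a fully-convolutional network maps a coarse radar range-angle heatmap to a super-resolved estimate instead of classifying a fixed gesture set. The following is a minimal PyTorch sketch of that idea; the layer widths, the bicubic-upsampling front end, and the names SuperResFCNN and scale are illustrative assumptions, not the network described in the paper.

# Minimal illustrative sketch (not the authors' architecture): a fully-convolutional
# regression network that super-resolves a coarse range-angle radar heatmap.
# Layer sizes and the bicubic front end are assumptions for demonstration only.
import torch
import torch.nn as nn

class SuperResFCNN(nn.Module):
    def __init__(self, scale: int = 4):
        super().__init__()
        # Interpolate the low-resolution heatmap up to the target size, then refine.
        self.upsample = nn.Upsample(scale_factor=scale, mode="bicubic", align_corners=False)
        self.refine = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),  # regress heatmap intensity
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, range_bins, angle_bins) coarse radar heatmap
        return self.refine(self.upsample(x))

# Usage: a 32x32 coarse heatmap becomes a 128x128 super-resolved estimate.
model = SuperResFCNN(scale=4)
coarse = torch.rand(1, 1, 32, 32)
fine = model(coarse)                               # shape: (1, 1, 128, 128)
loss = nn.MSELoss()(fine, torch.rand_like(fine))   # pixel-wise regression loss

In this regressive setting the training target would be a higher-resolution heatmap or peak location, so a pixel-wise loss such as MSE replaces the cross-entropy loss used in gesture-classification pipelines.
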
doi_str_mv 10.1109/TMM.2021.3079695
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 1520-9210
ispartof IEEE transactions on multimedia, 2022, Vol.24, p.2315-2328
issn 1520-9210
1941-0077
language eng
recordid cdi_ieee_primary_9429975
source IEEE Electronic Library (IEL)
subjects Algorithms
Artificial neural networks
Control equipment
Deep learning
fully-convolutional neural network (FCNN)
Human computer interaction
human-computer interaction (HCI)
Human-computer interface
Machine learning
Millimeter waves
millimeter-wave (mmWave)
multiple-input multiple-output (MIMO)
Music
Musical instruments
Optical measuring instruments
Optical sensors
Radar
Radar imaging
radar perception
Radar tracking
Signal processing
super-resolution
Tracking
Waveforms
title An FCNN-Based Super-Resolution mmWave Radar Framework for Contactless Musical Instrument Interface
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-19T13%3A36%3A11IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=An%20FCNN-Based%20Super-Resolution%20Mmwave%20Radar%20Framework%20for%20Contactless%20Musical%20Instrument%20Interface&rft.jtitle=IEEE%20transactions%20on%20multimedia&rft.au=W.%20Smith,%20Josiah&rft.date=2022&rft.volume=24&rft.spage=2315&rft.epage=2328&rft.pages=2315-2328&rft.issn=1520-9210&rft.eissn=1941-0077&rft.coden=ITMUF8&rft_id=info:doi/10.1109/TMM.2021.3079695&rft_dat=%3Cproquest_RIE%3E2662094305%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2662094305&rft_id=info:pmid/&rft_ieee_id=9429975&rfr_iscdi=true