MITNet: GAN Enhanced Magnetic Induction Tomography Based on Complex CNN
Magnetic induction tomography (MIT) is an efficient solution for long-term brain disease monitoring. It focuses on reconstructing the brain's bio-impedance distribution through nonintrusive electromagnetic fields. However, high-quality reconstruction of brain images remains a significant challenge, as reconstructing images from weak and noisy signals is a highly nonlinear and ill-conditioned problem. In this work, we propose a generative adversarial network (GAN) enhanced MIT technique, named MITNet, based on a complex convolutional neural network (CNN). MITNet takes complex-valued signals as input and outputs a discretized conductivity distribution map. Our approach leverages the power of GANs to eliminate artifacts and enhance the reconstruction of object shapes. The experimental results on the real-world dataset validate the performance of our technique. The F1 score of MITNet surpasses the state-of-the-art stacked auto-encoder (SAE) method by 5.33% on the agar data.
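The abstract describes a complex-valued CNN that maps complex coil measurements to a discretized conductivity map. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' published MITNet code: the class names `ComplexConv2d` and `ToyGenerator`, the channel counts, and the 16x16 measurement grid are assumptions made for the example.

```python
# Minimal sketch (assumed architecture, not the published MITNet) of a
# complex-valued convolution that consumes complex MIT measurements.
import torch
import torch.nn as nn


class ComplexConv2d(nn.Module):
    """Complex convolution: (a+ib)*(w+iv) = (a*w - b*v) + i(a*v + b*w)."""

    def __init__(self, in_ch, out_ch, kernel_size, padding=0):
        super().__init__()
        self.conv_r = nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding)  # real weights w
        self.conv_i = nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding)  # imaginary weights v

    def forward(self, x_re, x_im):
        out_re = self.conv_r(x_re) - self.conv_i(x_im)
        out_im = self.conv_i(x_re) + self.conv_r(x_im)
        return out_re, out_im


class ToyGenerator(nn.Module):
    """Maps complex sensor readings to a real-valued conductivity map (illustrative only)."""

    def __init__(self):
        super().__init__()
        self.cconv = ComplexConv2d(1, 8, kernel_size=3, padding=1)
        self.head = nn.Conv2d(16, 1, kernel_size=3, padding=1)  # concat(re, im) -> conductivity map

    def forward(self, meas):  # meas: complex tensor of shape (N, 1, H, W)
        re, im = self.cconv(meas.real, meas.imag)
        return self.head(torch.cat([re, im], dim=1))


# Usage with a hypothetical 16x16 excitation/detection coil layout.
signals = torch.randn(2, 1, 16, 16, dtype=torch.cfloat)
print(ToyGenerator()(signals).shape)  # torch.Size([2, 1, 16, 16])
```

In a GAN-enhanced setup such as the one the abstract describes, a generator of this kind would be paired with a discriminator trained to penalize artifact-laden reconstructions.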
Saved in:
| Published in: | IEEE sensors journal 2024-10, Vol.24 (20), p.33573-33584 |
|---|---|
| Main authors: | Chen, Zuohui; Chen, Cheng; Shao, Chongyang; Cai, Chang; Song, Xujie; Chen, Cheng; Xiang, Yun; Liu, Ruigang; Xuan, Qi |
| Format: | Article |
| Language: | English (eng) |
| Subjects: | Conductivity; Deep neural network (DNN); Diseases; electromagnetic inversion; electromagnetic tomography; Generative adversarial networks (GANs); Image reconstruction; magnetic induction tomography (MIT); Magnetic resonance imaging; Mathematical models; Tomography |
| Online access: | Order full text |

| container_end_page | 33584 |
|---|---|
| container_issue | 20 |
| container_start_page | 33573 |
| container_title | IEEE sensors journal |
| container_volume | 24 |
| creator | Chen, Zuohui; Chen, Cheng; Shao, Chongyang; Cai, Chang; Song, Xujie; Chen, Cheng; Xiang, Yun; Liu, Ruigang; Xuan, Qi |
| description | Magnetic induction tomography (MIT) is an efficient solution for long-term brain disease monitoring. It focuses on reconstructing the brain's bio-impedance distribution through nonintrusive electromagnetic fields. However, high-quality reconstruction of brain images remains a significant challenge, as reconstructing images from weak and noisy signals is a highly nonlinear and ill-conditioned problem. In this work, we propose a generative adversarial network (GAN) enhanced MIT technique, named MITNet, based on a complex convolutional neural network (CNN). MITNet takes complex-valued signals as input and outputs a discretized conductivity distribution map. Our approach leverages the power of GANs to eliminate artifacts and enhance the reconstruction of object shapes. The experimental results on the real-world dataset validate the performance of our technique. The F1 score of MITNet surpasses the state-of-the-art stacked auto-encoder (SAE) method by 5.33% on the agar data. |
| doi_str_mv | 10.1109/JSEN.2024.3350742 |
| format | Article |
| fulltext | fulltext_linktorsrc |
| identifier | ISSN: 1530-437X |
| ispartof | IEEE sensors journal, 2024-10, Vol.24 (20), p.33573-33584 |
| issn | 1530-437X; 1558-1748 |
| language | eng |
| recordid | cdi_ieee_primary_10439020 |
| source | IEEE Electronic Library (IEL) |
| subjects | Conductivity; Deep neural network (DNN); Diseases; electromagnetic inversion; electromagnetic tomography; Generative adversarial networks; generative adversarial networks (GANs); Image reconstruction; magnetic induction tomography (MIT); Magnetic resonance imaging; Mathematical models; Tomography |
| title | MITNet: GAN Enhanced Magnetic Induction Tomography Based on Complex CNN |
| url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-10T03%3A08%3A35IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-crossref_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=MITNet:%20GAN%20Enhanced%20Magnetic%20Induction%20Tomography%20Based%20on%20Complex%20CNN&rft.jtitle=IEEE%20sensors%20journal&rft.au=Chen,%20Zuohui&rft.date=2024-10-15&rft.volume=24&rft.issue=20&rft.spage=33573&rft.epage=33584&rft.pages=33573-33584&rft.issn=1530-437X&rft.eissn=1558-1748&rft.coden=ISJEAZ&rft_id=info:doi/10.1109/JSEN.2024.3350742&rft_dat=%3Ccrossref_RIE%3E10_1109_JSEN_2024_3350742%3C/crossref_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=10439020&rfr_iscdi=true |