Microscopic Hyperspectral Image Classification Based on Fusion Transformer with Parallel CNN
Microscopic hyperspectral image (MHSI) has received considerable attention in the medical field. The wealthy spectral information provides potentially powerful identification ability when combined with an advanced convolutional neural network (CNN). However, for high-dimensional MHSI, the local connection of CNN makes it difficult to extract the long-range dependencies of spectral bands. The transformer overcomes this problem well because of its self-attention mechanism. Nevertheless, the transformer is inferior to CNN in extracting detailed spatial features. Therefore, a classification framework integrating a transformer and a CNN in parallel, named Fusion Transformer (FUST), is proposed for MHSI classification tasks. Specifically, the transformer branch is employed to extract the overall semantics and capture the long-range dependencies of spectral bands to highlight the key spectral information. The parallel CNN branch is designed to extract significant multiscale spatial features. Furthermore, a feature fusion module is developed to effectively fuse and process the features extracted by the two branches. Experimental results on three MHSI datasets demonstrate that the proposed FUST achieves superior performance compared with state-of-the-art methods.
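The parallel two-branch design the abstract describes can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: all weights are random, and `fust_sketch`, `self_attention`, and `conv_branch` are illustrative names. It only shows the data flow, with one self-attention token per spectral band (transformer branch), a simple spatial filter standing in for the multiscale CNN branch, and concatenation plus a linear head standing in for the feature fusion module.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, seed=0):
    """Single-head self-attention: every spectral-band token attends to all others."""
    rng = np.random.default_rng(seed)
    d = tokens.shape[1]
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d))  # (bands, bands): long-range band dependencies
    return attn @ V

def conv_branch(img, k=3):
    """Toy spatial branch: a k x k mean filter stands in for the multiscale CNN."""
    pad = np.pad(img, k // 2, mode="edge")
    H, W = img.shape
    return np.array([[pad[i:i + k, j:j + k].mean() for j in range(W)]
                     for i in range(H)])

def fust_sketch(cube, n_classes=3, seed=1):
    """cube: (H, W, bands) hyperspectral patch -> class probabilities."""
    H, W, B = cube.shape
    tokens = cube.reshape(H * W, B).T                    # one token per spectral band
    spec_feat = self_attention(tokens).mean(axis=0)      # transformer branch, (H*W,)
    spat_feat = conv_branch(cube.mean(axis=2)).ravel()   # CNN branch, (H*W,)
    fused = np.concatenate([spec_feat, spat_feat])       # fusion: concat + linear head
    rng = np.random.default_rng(seed)
    Whead = rng.standard_normal((fused.size, n_classes)) / np.sqrt(fused.size)
    return softmax(fused @ Whead)
```

The sketch makes the division of labor concrete: attention mixes information across all bands regardless of their distance in the spectrum, while the spatial filter only sees a local neighborhood, which is why the two branches are complementary and are fused rather than stacked.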
Saved in:
Published in: | IEEE journal of biomedical and health informatics 2023-06, Vol.27 (6), p.1-12 |
---|---|
Main authors: | Zeng, Weijia; Li, Wei; Zhang, Mengmeng; Wang, Hao; Lv, Meng; Yang, Yue; Tao, Ran |
Format: | Article |
Language: | eng |
Subjects: | Artificial neural networks; Band spectra; Classification; Convolutional Neural Network (CNN); Convolutional neural networks; Data mining; Feature extraction; feature fusion; Hyperspectral imaging; Image classification; microscopic hyperspectral image (MHSI); Microscopy; Neural networks; Semantics; Spectral bands; Task analysis; transformer; Transformers |
Online access: | Order full text |
container_end_page | 12 |
---|---|
container_issue | 6 |
container_start_page | 1 |
container_title | IEEE journal of biomedical and health informatics |
container_volume | 27 |
creator | Zeng, Weijia; Li, Wei; Zhang, Mengmeng; Wang, Hao; Lv, Meng; Yang, Yue; Tao, Ran |
description | Microscopic hyperspectral image (MHSI) has received considerable attention in the medical field. The wealthy spectral information provides potentially powerful identification ability when combining with advanced convolutional neural network (CNN). However, for high-dimensional MHSI, the local connection of CNN makes it difficult to extract the long-range dependencies of spectral bands. Transformer overcomes this problem well because of its self-attention mechanism. Nevertheless, transformer is inferior to CNN in extracting spatial detailed features. Therefore, a classification framework integrating transformer and CNN in parallel, named as Fusion Transformer (FUST), is proposed for MHSI classification tasks. Specifically, the transformer branch is employed to extract the overall semantics and capture the long-range dependencies of spectral bands to highlight the key spectral information. The parallel CNN branch is designed to extract significant multiscale spatial features. Furthermore, the feature fusion module is developed to effectively fuse and process the features extracted by the two branches. Experimental results on three MHSI datasets demonstrate that the proposed FUST achieves superior performance when compared with state-of-the-art methods. |
doi_str_mv | 10.1109/JBHI.2023.3253722 |
format | Article |
fullrecord | Zeng, Weijia; Li, Wei; Zhang, Mengmeng; Wang, Hao; Lv, Meng; Yang, Yue; Tao, Ran. "Microscopic Hyperspectral Image Classification Based on Fusion Transformer with Parallel CNN." IEEE journal of biomedical and health informatics, 2023-06, Vol.27 (6), p.1-12. United States: IEEE. Peer reviewed. ISSN: 2168-2194; EISSN: 2168-2208; DOI: 10.1109/JBHI.2023.3253722; PMID: 37028325; CODEN: IJBHA9; IEEE document id: 10061535; ProQuest id: 2823186581. ORCID iDs: 0000-0002-5724-9785; 0000-0001-7015-7335; 0000-0002-2804-4342; 0000-0002-5243-7189; 0000-0003-1160-2528. Source: IEEE Electronic Library (IEL). |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 2168-2194 |
ispartof | IEEE journal of biomedical and health informatics, 2023-06, Vol.27 (6), p.1-12 |
issn | 2168-2194 2168-2208 |
language | eng |
recordid | cdi_proquest_journals_2823186581 |
source | IEEE Electronic Library (IEL) |
subjects | Artificial neural networks; Band spectra; Classification; Convolutional Neural Network (CNN); Convolutional neural networks; Data mining; Feature extraction; feature fusion; Hyperspectral imaging; Image classification; microscopic hyperspectral image (MHSI); Microscopy; Neural networks; Semantics; Spectral bands; Task analysis; transformer; Transformers |
title | Microscopic Hyperspectral Image Classification Based on Fusion Transformer with Parallel CNN |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-27T16%3A18%3A21IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Microscopic%20Hyperspectral%20Image%20Classification%20Based%20on%20Fusion%20Transformer%20with%20Parallel%20CNN&rft.jtitle=IEEE%20journal%20of%20biomedical%20and%20health%20informatics&rft.au=Zeng,%20Weijia&rft.date=2023-06-01&rft.volume=27&rft.issue=6&rft.spage=1&rft.epage=12&rft.pages=1-12&rft.issn=2168-2194&rft.eissn=2168-2208&rft.coden=IJBHA9&rft_id=info:doi/10.1109/JBHI.2023.3253722&rft_dat=%3Cproquest_RIE%3E2798708497%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2823186581&rft_id=info:pmid/37028325&rft_ieee_id=10061535&rfr_iscdi=true |