SSPNet: An interpretable 3D-CNN for classification of schizophrenia using phase maps of resting-state complex-valued fMRI data
Highlights:
• SSPNet was proposed for schizophrenia classification with interpretability modules.
• Phase (SSP) maps were used as 3D-CNN inputs to denoise complex-valued fMRI data.
• Saliency maps were generated to provide insight into the relevant brain regions.
• Grad-CAM was used to localize decision-making regions within SSP maps.
• SSPNet significantly improved classification compared to a CNN using magnitude maps.
Saved in:
Published in: | Medical image analysis, 2022-07, Vol. 79, p. 102430, Article 102430 |
---|---|
Main authors: | Lin, Qiu-Hua; Niu, Yan-Wei; Sui, Jing; Zhao, Wen-Da; Zhuo, Chuanjun; Calhoun, Vince D. |
Format: | Article |
Language: | eng |
Subjects: | Artificial neural networks; Biomarkers; Brain; Brain - diagnostic imaging; Brain - physiology; Brain mapping; Classification; Complex-valued fMRI data; Convolutional neural network; Functional magnetic resonance imaging; Grad-CAM; Humans; Interpretability; Magnetic Resonance Imaging - methods; Mental disorders; Neural networks; Neural Networks, Computer; Salience; Saliency map; Schizophrenia; Schizophrenia - diagnostic imaging; Spatial source phase |
Online access: | Full text |
Description: | Convolutional neural networks (CNNs) have shown promising results in classifying individuals with mental disorders such as schizophrenia using resting-state fMRI data. However, complex-valued fMRI data are rarely used, because the additional phase data introduce high-level noise even though they carry information potentially useful for classification. As such, we propose to use spatial source phase (SSP) maps derived from complex-valued fMRI data as the CNN input. The SSP maps are not only less noisy but also more sensitive than magnitude maps to spatial activation changes caused by mental disorders. We build a 3D-CNN framework with two convolutional layers (named SSPNet) to fully exploit the 3D structure and voxel-level relationships in the SSP maps. Two interpretability modules, consisting of saliency map generation and gradient-weighted class activation mapping (Grad-CAM), are incorporated into the well-trained SSPNet to provide additional information helpful for understanding the output. Experimental results from classifying schizophrenia patients (SZs) and healthy controls (HCs) show that the proposed SSPNet significantly improved accuracy and AUC compared to a CNN using magnitude maps extracted from either magnitude-only fMRI data (by 23.4 and 23.6% for the default mode network, DMN) or complex-valued fMRI data (by 10.6 and 5.8% for the DMN). SSPNet captured more prominent HC-SZ differences in saliency maps, and Grad-CAM localized all contributing brain regions, with opposite strengths for HCs and SZs, within SSP maps. These results indicate the potential of SSPNet as a sensitive tool that may be useful for the development of brain-based biomarkers of mental disorders. |
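The abstract describes a two-convolutional-layer 3D-CNN that classifies 3D spatial source phase (SSP) maps, plus two interpretability modules: input-gradient saliency maps and Grad-CAM. The PyTorch sketch below only illustrates those three ideas; it is not the authors' implementation, and the names (SSPNetSketch, saliency_map, grad_cam_3d), the layer widths, and the assumed 53x63x46 input grid are illustrative assumptions.

```python
# Minimal, hypothetical sketch of an SSPNet-like pipeline (not the published code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SSPNetSketch(nn.Module):
    """3D-CNN with two convolutional layers for HC-vs-SZ classification of SSP maps."""
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.conv1 = nn.Conv3d(1, 8, kernel_size=3, padding=1)   # first conv layer
        self.conv2 = nn.Conv3d(8, 16, kernel_size=3, padding=1)  # second conv layer
        self.pool = nn.MaxPool3d(2)
        self.gap = nn.AdaptiveAvgPool3d(1)                        # global average pooling
        self.fc = nn.Linear(16, n_classes)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        self.feat = F.relu(self.conv2(x))      # keep last conv features for Grad-CAM
        x = self.gap(self.feat).flatten(1)
        return self.fc(x)                       # class scores (logits)

def saliency_map(model, ssp_map, target_class):
    """Saliency: absolute gradient of the target-class score w.r.t. input voxels."""
    x = ssp_map.clone().requires_grad_(True)
    score = model(x)[0, target_class]
    score.backward()
    return x.grad.abs().squeeze()

def grad_cam_3d(model, ssp_map, target_class):
    """Grad-CAM: weight last-conv feature maps by their channel-wise pooled gradients."""
    score = model(ssp_map)[0, target_class]
    grads = torch.autograd.grad(score, model.feat)[0]       # d(score)/d(features)
    weights = grads.mean(dim=(2, 3, 4), keepdim=True)       # pooled gradients per channel
    cam = F.relu((weights * model.feat).sum(dim=1, keepdim=True))
    # Upsample the coarse heatmap back to the input volume for localization.
    return F.interpolate(cam, size=ssp_map.shape[2:], mode="trilinear",
                         align_corners=False).squeeze()

if __name__ == "__main__":
    model = SSPNetSketch().eval()
    ssp = torch.randn(1, 1, 53, 63, 46)        # one SSP map with batch and channel dims
    sal = saliency_map(model, ssp, target_class=1)
    cam = grad_cam_3d(model, ssp, target_class=1)
    print(sal.shape, cam.shape)                # both match the input volume
```

In this sketch, Grad-CAM reuses the feature maps of the second (deepest) convolutional layer, the usual choice for class-activation mapping, while the saliency map is simply the gradient of the target-class score with respect to the input voxels.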
DOI: | 10.1016/j.media.2022.102430 |
Identifiers: | ISSN: 1361-8415; EISSN: 1361-8423; PMID: 35397470 |
Source: | MEDLINE; Access via ScienceDirect (Elsevier) |