SchizoGoogLeNet: The GoogLeNet-Based Deep Feature Extraction Design for Automatic Detection of Schizophrenia

Schizophrenia (SZ) is a severe, long-term disorder of the human brain in which people interpret reality abnormally. Traditional methods of SZ detection rely on handcrafted feature extraction (a manual process), which is tedious, unsophisticated, and limited in its ability to balance efficiency and accuracy. To address this, this study designed a deep learning-based feature extraction scheme built on the GoogLeNet model, called "SchizoGoogLeNet," that can efficiently and automatically distinguish schizophrenic patients from healthy control (HC) subjects using electroencephalogram (EEG) signals with improved performance. The proposed framework involves multiple stages of EEG data processing. First, the study applies an average filter to remove noise and artifacts from the raw EEG signals and improve the signal-to-noise ratio. Next, a GoogLeNet model is designed to discover significant hidden features in the denoised signals that separate schizophrenic patients from HC subjects. Finally, the resulting deep feature set is evaluated with the GoogLeNet classifier as well as several well-known machine learning classifiers to find a sustainable classification method for it. Experimental results show that the proposed deep feature extraction model with a support vector machine performs best, producing a 99.02% correct classification rate for SZ and an overall accuracy of 98.84%. The proposed design outperforms existing methods, accurately discriminates SZ from HC, and will be useful for developing a diagnostic tool for SZ detection.
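The abstract outlines a three-stage pipeline: average filtering for denoising, GoogLeNet-based deep feature extraction, and classification of the resulting deep features, with a support vector machine performing best. The sketch below illustrates that pipeline in Python. It is not the authors' implementation: the filter width, segment length, and the tiling used to turn a 1-D EEG segment into GoogLeNet's image-shaped input are illustrative assumptions, and torchvision's ImageNet-pretrained GoogLeNet stands in for the network trained in the paper.

```python
# Minimal sketch of the pipeline described in the abstract:
# average filtering -> GoogLeNet deep features -> SVM classification.
# Segment length, filter width, and the 1-D -> image mapping are assumptions.
import numpy as np
import torch
import torchvision.models as models
from sklearn.svm import SVC

def average_filter(signal, width=5):
    """Moving-average (mean) filter to suppress noise in a 1-D EEG signal."""
    kernel = np.ones(width) / width
    return np.convolve(signal, kernel, mode="same")

# Pretrained GoogLeNet with the final classifier replaced by Identity,
# so a forward pass yields the 1024-dimensional deep feature vector.
googlenet = models.googlenet(weights=models.GoogLeNet_Weights.DEFAULT)
googlenet.fc = torch.nn.Identity()
googlenet.eval()

def deep_features(segment):
    """Map a denoised EEG segment to GoogLeNet features.
    Assumption: the segment is tiled into a 3x224x224 tensor, since
    GoogLeNet expects image-shaped input; the paper's mapping may differ."""
    x = np.resize(segment, (224, 224)).astype(np.float32)
    x = torch.from_numpy(x).unsqueeze(0).repeat(3, 1, 1).unsqueeze(0)
    with torch.no_grad():
        return googlenet(x).squeeze(0).numpy()

# Hypothetical data: raw EEG segments with 0 (HC) / 1 (SZ) labels.
rng = np.random.default_rng(0)
segments = rng.standard_normal((20, 5000))
labels = np.array([0, 1] * 10)

X = np.stack([deep_features(average_filter(s)) for s in segments])
clf = SVC(kernel="rbf").fit(X, labels)  # best-performing classifier per the abstract
print("training accuracy:", clf.score(X, labels))
```

In the paper, GoogLeNet is trained on the EEG data itself and its own softmax output is one of the compared classifiers; the sketch only mirrors the feature-extraction-plus-SVM route that the abstract reports as the best performer.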

Bibliographic Details
Published in: Computational Intelligence and Neuroscience, 2022-09, Vol. 2022, pp. 1992596-13
Main authors: Siuly, Siuly; Li, Yan; Wen, Peng; Alcin, Omer Faruk
Format: Article
Language: English
Online access: Full text
DOI: 10.1155/2022/1992596
PMID: 36120676
Publisher: Hindawi
Rights: © 2022 Siuly Siuly et al.; open access under the Creative Commons Attribution License (CC BY 4.0)
ISSN: 1687-5265
EISSN: 1687-5273
Source: MEDLINE; PubMed Central Open Access; EZB-FREE-00999 freely available EZB journals; Wiley Online Library (Open Access Collection); PubMed Central; Alma/SFX Local Collection
Subjects: Accuracy
Classification
Classifiers
Data processing
Deep learning
Discriminant analysis
EEG
Electroencephalography
Electroencephalography - methods
Feature extraction
Human error
Humans
Machine Learning
Mental disorders
Schizophrenia
Schizophrenia - diagnosis
Signal Processing, Computer-Assisted
Signal to noise ratio
Support Vector Machine
Support vector machines
Tomography