Transfer learning with deep convolutional neural network for constitution classification with face image
Constitution classification is the basis and core content of constitution research in Traditional Chinese medicine. Convolutional neural networks have produced many successful image classification models, but they require large amounts of training data, and in the field of Traditional Chinese medicine the available clinical data are very limited. To solve this problem, we propose a method for constitution classification based on transfer learning. First, the DenseNet-169 model pretrained on ImageNet is adopted. Second, we carefully modify the DenseNet-169 structure according to the constitution characteristics and train the modified model on the clinical data to obtain the constitution identification network, called ConstitutionNet. To further improve classification accuracy, we integrate ConstitutionNet with VGG-16, Inception v3 and DenseNet-121 following the ensemble learning idea, and the input face image is assigned to its constitution type. The experimental results show that transfer learning achieves better results on the small clinical dataset, and the final constitution recognition accuracy is 66.79%.
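The abstract describes a two-stage pipeline: take DenseNet-169 pretrained on ImageNet, adapt its classifier head to the constitution labels, and fine-tune the network on the small clinical face-image dataset. The sketch below is a minimal PyTorch/torchvision illustration of that idea, not the authors' implementation; the class count `NUM_CONSTITUTION_TYPES`, the folder layout `clinical_faces/train`, and all hyperparameters are assumptions, and the specific structural modifications the paper makes to DenseNet-169 are not reproduced here.

```python
# Hypothetical sketch: fine-tune an ImageNet-pretrained DenseNet-169 for
# constitution classification, roughly following the pipeline in the abstract.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CONSTITUTION_TYPES = 9   # assumption; the record does not state the class count

# Step 1: load DenseNet-169 with ImageNet weights.
model = models.densenet169(weights=models.DenseNet169_Weights.IMAGENET1K_V1)

# Step 2: replace the ImageNet classifier with a head sized for the constitution
# labels (a simple new linear head stands in for the paper's modifications).
model.classifier = nn.Linear(model.classifier.in_features, NUM_CONSTITUTION_TYPES)

# Standard ImageNet-style preprocessing for the clinical face images.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# "clinical_faces/train" is a placeholder layout: one subfolder per constitution type.
train_set = datasets.ImageFolder("clinical_faces/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

# Fine-tune the whole network on the small clinical dataset.
model.train()
for epoch in range(10):                      # epoch count is illustrative
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```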
Saved in:

| Published in: | Multimedia tools and applications, 2020-05, Vol. 79 (17-18), p. 11905-11919 |
|---|---|
| Main authors: | Huan, Er-Yang; Wen, Gui-Hua |
| Format: | Article |
| Language: | eng |
| Keywords: | Artificial neural networks; Chinese medicine; Classification; Computer Communication Networks; Computer Science; Constitution; Data Structures and Information Theory; Image classification; Learning; Multimedia Information Systems; Neural networks; Special Purpose and Application-Based Systems; Traditional Chinese medicine |
| Online access: | Full text |
container_end_page | 11919 |
container_issue | 17-18 |
container_start_page | 11905 |
container_title | Multimedia tools and applications |
container_volume | 79 |
creator | Huan, Er-Yang ; Wen, Gui-Hua |
description | Constitution classification is the basis and core content of constitution research in Traditional Chinese medicine. Convolutional neural networks have produced many successful image classification models, but they require large amounts of training data, and in the field of Traditional Chinese medicine the available clinical data are very limited. To solve this problem, we propose a method for constitution classification based on transfer learning. First, the DenseNet-169 model pretrained on ImageNet is adopted. Second, we carefully modify the DenseNet-169 structure according to the constitution characteristics and train the modified model on the clinical data to obtain the constitution identification network, called ConstitutionNet. To further improve classification accuracy, we integrate ConstitutionNet with VGG-16, Inception v3 and DenseNet-121 following the ensemble learning idea, and the input face image is assigned to its constitution type. The experimental results show that transfer learning achieves better results on the small clinical dataset, and the final constitution recognition accuracy is 66.79%. (A minimal soft-voting sketch of this ensembling step follows the field list below.) |
doi_str_mv | 10.1007/s11042-019-08376-5 |
format | Article |
identifier | ISSN: 1380-7501 |
ispartof | Multimedia tools and applications, 2020-05, Vol.79 (17-18), p.11905-11919 |
issn | 1380-7501; 1573-7721 (EISSN) |
language | eng |
recordid | cdi_proquest_journals_2397280370 |
source | SpringerLink Journals |
subjects | Artificial neural networks; Chinese medicine; Classification; Computer Communication Networks; Computer Science; Constitution; Data Structures and Information Theory; Image classification; Learning; Multimedia Information Systems; Neural networks; Special Purpose and Application-Based Systems; Traditional Chinese medicine |
title | Transfer learning with deep convolutional neural network for constitution classification with face image |
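As noted in the description field above, the method's final step integrates ConstitutionNet with VGG-16, Inception v3 and DenseNet-121. The sketch below shows one common way to realize such an integration, soft voting over the models' class probabilities, in PyTorch; this is an assumption about the mechanics rather than the paper's documented rule, and the four model variables are placeholders for already fine-tuned networks.

```python
# Hypothetical sketch of the ensembling step: average the class probabilities
# of several fine-tuned classifiers and take the most probable constitution type.
import torch
import torch.nn.functional as F

def ensemble_predict(models, image_batch):
    """Soft voting: average the softmax outputs of several classifiers."""
    probs = []
    with torch.no_grad():
        for m in models:
            m.eval()
            # In practice each backbone may need its own input size
            # (e.g. 299x299 for Inception v3 vs 224x224 for the others).
            probs.append(F.softmax(m(image_batch), dim=1))
    mean_probs = torch.stack(probs).mean(dim=0)
    return mean_probs.argmax(dim=1)  # predicted constitution type per image

# Hypothetical usage with four fine-tuned models:
# labels = ensemble_predict([constitution_net, vgg16, inception_v3, densenet121], batch)
```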