Deep Iris: Deep Learning for Gender Classification Through Iris Patterns
One attractive research area in computer science is soft biometrics. Identifying a person's gender from an iris image is relevant to security surveillance systems and forensic applications. In this paper, a robust iris gender-identification method based on a deep convolutional neural network is introduced.
Published in: | Acta informatica medica 2019-06, Vol.27 (2), p.96-102 |
---|---|
Main authors: | Khalifa, Nour Eldeen M; Taha, Mohamed Hamed N; Hassanien, Aboul Ella; Mohamed, Hamed Nasr Eldin T |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 102 |
---|---|
container_issue | 2 |
container_start_page | 96 |
container_title | Acta informatica medica |
container_volume | 27 |
creator | Khalifa, Nour Eldeen M; Taha, Mohamed Hamed N; Hassanien, Aboul Ella; Mohamed, Hamed Nasr Eldin T |
description | One attractive research area in computer science is soft biometrics.
Identifying a person's gender from an iris image is of interest for security surveillance systems and forensic applications.
In this paper, a robust iris gender-identification method based on a deep convolutional neural network is introduced. The proposed architecture segments the iris from the background image using the graph-cut segmentation technique. The proposed model contains 16 sequential layers; three are convolutional layers for feature extraction with different convolution window sizes, followed by three fully connected layers for classification.
The original dataset consists of 3,000 images: 1,500 images of men and 1,500 images of women. The augmentation techniques adopted in this research overcome the overfitting problem and make the proposed architecture more robust, preventing it from simply memorizing the training data. In addition, the augmentation process not only increased the dataset to 9,000 images for the training phase, 3,000 images for the testing phase, and 3,000 images for the verification phase, but also led to a significant improvement in testing accuracy, with the proposed architecture achieving 98.88%. A comparison is presented in which the testing accuracy of the proposed approach is compared with that of other related works on the same dataset.
The proposed architecture outperformed the other related works in terms of testing accuracy. |
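The description above gives only the coarse shape of the network (16 layers; three convolutional layers with different window sizes, then three fully connected layers ending in a two-class output). A minimal sketch of the resulting layer-shape arithmetic follows; the kernel sizes, pooling, channel counts, and the 64x64 input resolution are illustrative assumptions, since this record does not state them.

```python
def conv2d_out(size, kernel, stride=1, padding=0):
    """Spatial output size of a square convolution (or pooling) window."""
    return (size + 2 * padding - kernel) // stride + 1

def sketch_shapes(input_size=64):
    """Propagate shapes through an assumed 3-conv / 3-FC architecture."""
    size, shapes = input_size, []
    # Three conv layers with different (assumed) window sizes, each
    # followed by 2x2 max pooling with stride 2.
    for kernel, channels in [(7, 16), (5, 32), (3, 64)]:
        size = conv2d_out(size, kernel)       # convolution, no padding
        size = conv2d_out(size, 2, stride=2)  # 2x2 pooling
        shapes.append((size, size, channels))
    flat = size * size * 64                   # flatten before the FC stack
    # Three fully connected layers; the last has 2 units (male / female).
    return shapes, [flat, 128, 2]

conv_shapes, fc_sizes = sketch_shapes()
```

Under these assumed settings, a 64x64 iris crop shrinks to a 5x5x64 feature map before the fully connected stack; swapping in the paper's actual hyperparameters only changes the numbers, not the arithmetic.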
doi_str_mv | 10.5455/aim.2019.27.96-102 |
format | Article |
fullrecord | ProQuest/PubMed Central XML record (machine-readable; title, authors, and abstract duplicate the fields above). Additional fields: ISSN: 0353-8109; EISSN: 1986-5988; DOI: 10.5455/aim.2019.27.96-102; PMID: 31452566; published 2019-06-01 in Acta informatica medica (Acta Inform Med), Vol.27 (2), p.96-102 (7 pages); publisher: Academy of Medical Sciences of Bosnia and Herzegovina, Bosnia and Herzegovina; rights: CC BY-NC-SA 4.0, 2019 the authors; peer reviewed; free full text via PubMed Central (PMC6689381). |
fulltext | fulltext |
identifier | ISSN: 0353-8109 |
ispartof | Acta informatica medica, 2019-06, Vol.27 (2), p.96-102 |
issn | 0353-8109 (print); 1986-5988 (electronic) |
language | eng |
recordid | cdi_pubmedcentral_primary_oai_pubmedcentral_nih_gov_6689381 |
source | EZB-FREE-00999 freely available EZB journals; PubMed Central |
subjects | Accuracy; Artificial neural networks; Augmentation; Biometrics; Convolution; Datasets; Deep learning; Feature extraction; Gender; Identification methods; Image classification; Image segmentation; Original Paper; Surveillance systems; Training |
title | Deep Iris: Deep Learning for Gender Classification Through Iris Patterns |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-26T14%3A41%3A02IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_pubme&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Deep%20Iris:%20Deep%20Learning%20for%20Gender%20Classification%20Through%20Iris%20Patterns&rft.jtitle=Acta%20informatica%20medica&rft.au=Khalifa,%20Nour%20Eldeen%20M&rft.date=2019-06-01&rft.volume=27&rft.issue=2&rft.spage=96&rft.epage=102&rft.pages=96-102&rft.issn=0353-8109&rft.eissn=1986-5988&rft_id=info:doi/10.5455/aim.2019.27.96-102&rft_dat=%3Cproquest_pubme%3E2281109499%3C/proquest_pubme%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2300612562&rft_id=info:pmid/31452566&rfr_iscdi=true |
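The augmentation step described in the abstract triples the 3,000 original images to 9,000 for the training phase. The record does not name the operators used; a minimal pure-Python sketch, assuming horizontal mirroring and a 180-degree rotation as the two extra variants per image:

```python
def augment_triple(images):
    """Triple a batch of images (each a list of pixel rows), e.g. 3,000 -> 9,000.

    Mirroring and 180-degree rotation are illustrative assumptions; the
    record does not specify which augmentation operators the paper used.
    """
    mirrored = [[row[::-1] for row in img] for img in images]       # left-right flip
    rotated = [[row[::-1] for row in img[::-1]] for img in images]  # rotate 180 degrees
    return list(images) + mirrored + rotated

# A batch of 3,000 dummy 2x2 "images" grows to 9,000 after augmentation.
batch = [[[0, 1], [2, 3]]] * 3000
assert len(augment_triple(batch)) == 9000
```

Geometric transforms like these add label-preserving variety, which is what lets the augmented dataset reduce overfitting rather than merely enlarge the training set.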