A real-time high precision eye center localizer
Precise eye center localization remains a very promising but challenging task, and its real-time performance constitutes a critical constraint in many human interaction applications. In this paper, a new hybrid framework that combines the shape-based Modified Fast Radial Symmetry Transform (MFRST) and a Convolutional Neural Network (CNN) is introduced. The motivation of this work is to exploit the circularity of the iris to reduce the search space and, consequently, the computational complexity of the subsequent CNN. Thus, the proposed hybrid scheme not only achieves real-time performance, but also substantially increases the localization accuracy by reducing the false detections of the MFRST. Experimental results on the most challenging face databases demonstrated high accuracy, outperforming state-of-the-art techniques, even those based on end-to-end deep neural networks. To deal with unreliable data and provide a valid evaluation, we manually annotated the FERET database, making the annotations publicly available. Moreover, the reduced computational time of the proposed scheme shows that it can be incorporated in low-cost eye trackers, where real-time performance is a basic prerequisite.
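The abstract describes a two-stage pipeline: a radial-symmetry transform exploits the circularity of the iris to propose a small set of candidate eye centers, and a CNN then re-scores those candidates to suppress false detections while keeping the overall cost low. The sketch below is not the authors' MFRST or their network; it is a minimal illustration of that general idea, assuming OpenCV and NumPy, a simplified dark-blob radial-symmetry voting scheme, and a hypothetical `cnn_score` callable standing in for the verification CNN. The function names, radii, and thresholds are placeholders, not values from the paper.

```python
import cv2
import numpy as np

def radial_symmetry_map(gray, radii, alpha=2.0, grad_thresh=10.0):
    """Simplified fast-radial-symmetry voting over a single-channel image.

    A rough stand-in for the paper's MFRST stage: every strong gradient pixel
    votes for a possible circle centre one radius away against the gradient
    direction, which is where the centre of a dark iris on a brighter sclera lies.
    """
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    mag = np.hypot(gx, gy)
    h, w = gray.shape
    ys, xs = np.nonzero(mag > grad_thresh)                  # only strong edges vote
    ux, uy = gx[ys, xs] / mag[ys, xs], gy[ys, xs] / mag[ys, xs]

    symmetry = np.zeros((h, w))
    for n in radii:
        votes = np.zeros((h, w))
        weight = np.zeros((h, w))
        # centres of dark circles lie against the gradient direction
        cx = np.clip(np.rint(xs - n * ux).astype(int), 0, w - 1)
        cy = np.clip(np.rint(ys - n * uy).astype(int), 0, h - 1)
        np.add.at(votes, (cy, cx), 1.0)
        np.add.at(weight, (cy, cx), mag[ys, xs])
        contrib = (votes / max(votes.max(), 1e-9)) ** alpha * weight
        # spread each radius' contribution a little, as FRST-style methods do
        symmetry += cv2.GaussianBlur(contrib, (0, 0), sigmaX=0.25 * n + 0.5)
    return symmetry / len(radii)

def localize_eye_center(eye_patch_gray, cnn_score=None, radii=range(4, 13, 2), top_k=5):
    """Hybrid sketch: symmetry peaks propose candidates, a CNN re-scores them.

    `cnn_score(patch, (y, x)) -> float` is a hypothetical callable standing in
    for the verification network; without it the strongest peak is returned.
    """
    sym = radial_symmetry_map(eye_patch_gray.astype(np.float64), radii)
    order = np.argsort(sym, axis=None)[::-1][:top_k]              # strongest responses
    candidates = [np.unravel_index(i, sym.shape) for i in order]  # (y, x) tuples
    # (a real system would apply non-maximum suppression before taking top_k)
    if cnn_score is None:
        return candidates[0]
    return max(candidates, key=lambda c: cnn_score(eye_patch_gray, c))
```

On a cropped eye region, `radii` should roughly span the expected iris radius in pixels; the values above are illustrative only.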
Saved in:
Published in: | Journal of real-time image processing 2022-04, Vol.19 (2), p.475-486 |
---|---|
Main authors: | Poulopoulos, Nikolaos; Psarakis, Emmanouil Z. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 486 |
---|---|
container_issue | 2 |
container_start_page | 475 |
container_title | Journal of real-time image processing |
container_volume | 19 |
creator | Poulopoulos, Nikolaos; Psarakis, Emmanouil Z. |
description | Precise eye center localization remains a very promising but challenging task, and its real-time performance constitutes a critical constraint in many human interaction applications. In this paper, a new hybrid framework that combines the shape-based Modified Fast Radial Symmetry Transform (MFRST) and a Convolutional Neural Network (CNN) is introduced. The motivation of this work is to exploit the circularity of the iris to reduce the search space and, consequently, the computational complexity of the subsequent CNN. Thus, the proposed hybrid scheme not only achieves real-time performance, but also substantially increases the localization accuracy by reducing the false detections of the MFRST. Experimental results on the most challenging face databases demonstrated high accuracy, outperforming state-of-the-art techniques, even those based on end-to-end deep neural networks. To deal with unreliable data and provide a valid evaluation, we manually annotated the FERET database, making the annotations publicly available. Moreover, the reduced computational time of the proposed scheme shows that it can be incorporated in low-cost eye trackers, where real-time performance is a basic prerequisite. |
doi_str_mv | 10.1007/s11554-022-01200-8 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 1861-8200 |
ispartof | Journal of real-time image processing, 2022-04, Vol.19 (2), p.475-486 |
issn | 1861-8200; 1861-8219 |
language | eng |
recordid | cdi_proquest_journals_2918677421 |
source | ProQuest Central UK/Ireland; SpringerLink Journals - AutoHoldings; ProQuest Central |
subjects | Accuracy; Algorithms; Annotations; Artificial neural networks; Computer Graphics; Computer Science; Computing time; Eye (anatomy); Image Processing and Computer Vision; Localization; Machine learning; Methods; Multimedia Information Systems; Neural networks; Original Research Paper; Pattern Recognition; Real time; Signal, Image and Speech Processing; Symmetry |
title | A real-time high precision eye center localizer |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-08T00%3A37%3A58IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20real-time%20high%20precision%20eye%20center%20localizer&rft.jtitle=Journal%20of%20real-time%20image%20processing&rft.au=Poulopoulos,%20Nikolaos&rft.date=2022-04-01&rft.volume=19&rft.issue=2&rft.spage=475&rft.epage=486&rft.pages=475-486&rft.issn=1861-8200&rft.eissn=1861-8219&rft_id=info:doi/10.1007/s11554-022-01200-8&rft_dat=%3Cproquest_cross%3E2918677421%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2918677421&rft_id=info:pmid/&rfr_iscdi=true |