A view-invariant gait recognition algorithm based on a joint-direct linear discriminant analysis

This paper proposes a view-invariant gait recognition algorithm that builds a single view-invariant model by exploiting the dimensionality reduction provided by Direct Linear Discriminant Analysis (DLDA). The proposed scheme reduces the under-sampling problem (USP), which usually appears when the number of training samples is much smaller than the dimension of the feature space.

Detailed description

Saved in:
Bibliographic details
Published in: Applied intelligence (Dordrecht, Netherlands), 2018-05, Vol.48 (5), p.1200-1217
Main authors: Portillo-Portillo, Jose, Leyva, Roberto, Sanchez, Victor, Sanchez-Perez, Gabriel, Perez-Meana, Hector, Olivares-Mercado, Jesus, Toscano-Medina, Karina, Nakano-Miyatake, Mariko
Format: Article
Language: eng
Subjects:
Online access: Full text
container_end_page 1217
container_issue 5
container_start_page 1200
container_title Applied intelligence (Dordrecht, Netherlands)
container_volume 48
creator Portillo-Portillo, Jose
Leyva, Roberto
Sanchez, Victor
Sanchez-Perez, Gabriel
Perez-Meana, Hector
Olivares-Mercado, Jesus
Toscano-Medina, Karina
Nakano-Miyatake, Mariko
description This paper proposes a view-invariant gait recognition algorithm that builds a single view-invariant model by exploiting the dimensionality reduction provided by Direct Linear Discriminant Analysis (DLDA). The proposed scheme reduces the under-sampling problem (USP), which usually appears when the number of training samples is much smaller than the dimension of the feature space. The proposed approach uses Gait Energy Images (GEIs) and DLDA to create a view-invariant model that determines, with high accuracy, the identity of the person under analysis independently of the incoming view angle. Evaluation results show that the proposed scheme provides recognition performance that is largely independent of the view angle and outperforms previously proposed gait recognition methods in terms of both computational complexity and recognition accuracy.
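The DLDA step described in the abstract can be sketched roughly as follows. This is a minimal illustrative implementation under stated assumptions, not the authors' code; the function name `dlda` and its parameters are hypothetical. DLDA first diagonalizes the between-class scatter and keeps only its range, then whitens the within-class scatter inside that subspace, which is what lets it work when there are far fewer training samples than feature dimensions (the under-sampling problem).

```python
import numpy as np

def dlda(X, y, n_components):
    """Direct LDA sketch (after Yu & Yang's DLDA idea).
    X: (n_samples, n_features) matrix of flattened GEIs.
    y: (n_samples,) integer class labels.
    Returns W: (n_features, n_components) projection matrix."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    # Class-mean deviations, weighted so Sb = Mw^T Mw.
    M = np.stack([X[y == c].mean(axis=0) - mean_all for c in classes])
    counts = np.array([(y == c).sum() for c in classes], dtype=float)
    Mw = M * np.sqrt(counts)[:, None]
    # Diagonalize Sb through the small (C x C) matrix Mw Mw^T,
    # which shares Sb's nonzero spectrum; keep only its range.
    evals, evecs = np.linalg.eigh(Mw @ Mw.T)
    keep = evals > 1e-10
    Y = Mw.T @ evecs[:, keep] / np.sqrt(evals[keep])  # Sb-whitening basis
    # Project within-class deviations into that basis.
    class_means = np.stack([X[y == c].mean(axis=0) for c in classes])
    Xc = X - class_means[np.searchsorted(classes, y)]
    Z = Xc @ Y
    # Diagonalize projected within-class scatter; in DLDA the directions
    # with the SMALLEST within-class spread are the most discriminative.
    ev2, U = np.linalg.eigh(Z.T @ Z)
    order = np.argsort(ev2)[:n_components]
    return Y @ U[:, order]
```

With each GEI flattened into a vector, training amounts to computing `W = dlda(X_train, labels, d)`, and a probe sequence can then be classified by nearest neighbour in the projected space `X @ W`.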
doi_str_mv 10.1007/s10489-017-1043-8
format Article
fulltext fulltext
identifier ISSN: 0924-669X
ispartof Applied intelligence (Dordrecht, Netherlands), 2018-05, Vol.48 (5), p.1200-1217
issn 0924-669X
1573-7497
language eng
recordid cdi_proquest_journals_2022493214
source SpringerNature Journals
subjects Accuracy
Artificial Intelligence
Computer Science
Discriminant analysis
Energy consumption
Gait recognition
Invariants
Machines
Manufacturing
Mechanical Engineering
Processes
title A view-invariant gait recognition algorithm based on a joint-direct linear discriminant analysis
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-16T07%3A38%3A21IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20view-invariant%20gait%20recognition%20algorithm%20based%20on%20a%20joint-direct%20linear%20discriminant%20analysis&rft.jtitle=Applied%20intelligence%20(Dordrecht,%20Netherlands)&rft.au=Portillo-Portillo,%20Jose&rft.date=2018-05-01&rft.volume=48&rft.issue=5&rft.spage=1200&rft.epage=1217&rft.pages=1200-1217&rft.issn=0924-669X&rft.eissn=1573-7497&rft_id=info:doi/10.1007/s10489-017-1043-8&rft_dat=%3Cproquest_cross%3E2022493214%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2022493214&rft_id=info:pmid/&rfr_iscdi=true