OCT fingerprint section image authenticity detection method based on reconstruction difference

The invention discloses an OCT fingerprint section image authenticity detection method based on reconstruction difference. The method comprises the following steps: S1, constructing a fully convolutional neural network model comprising an encoder, a generator and a feature extractor; S2, collecting im...
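Step S1 above describes a fully convolutional model built from three parts: an encoder, a generator that reconstructs the input from the encoder's output, and a feature extractor used for a comparison loss. The following is a minimal, hypothetical PyTorch sketch of such a three-part model; the layer counts, channel widths, and single-channel OCT input are illustrative assumptions, not details taken from the patent.

```python
# Hypothetical sketch of the three components named in step S1.
# All architectural choices (depths, channel widths, activations) are assumptions.
import torch.nn as nn

def conv_block(c_in, c_out, stride):
    """Convolution + normalization + activation, used by encoder and feature extractor."""
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, kernel_size=3, stride=stride, padding=1),
        nn.BatchNorm2d(c_out),
        nn.LeakyReLU(0.2, inplace=True),
    )

class Encoder(nn.Module):
    """Maps an OCT fingerprint section image to a latent feature map."""
    def __init__(self, latent_channels=64):
        super().__init__()
        self.net = nn.Sequential(
            conv_block(1, 16, stride=2),   # single-channel B-scan assumed
            conv_block(16, 32, stride=2),
            conv_block(32, latent_channels, stride=2),
        )
    def forward(self, x):
        return self.net(x)

class Generator(nn.Module):
    """Decodes the latent map back to an image (the reconstruction)."""
    def __init__(self, latent_channels=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_channels, 32, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
            nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(z)

class FeatureExtractor(nn.Module):
    """Embeds an image for the comparison (contrastive-style) loss; fully convolutional
    by using global average pooling instead of a dense layer."""
    def __init__(self, embed_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            conv_block(1, 16, stride=2),
            conv_block(16, 32, stride=2),
            conv_block(32, embed_dim, stride=2),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
    def forward(self, x):
        return self.net(x)
```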

Detailed description

Bibliographic details
Main authors: ZHU CHENGFANG, ZHANG YILONG, LIANG RONGHUA, CHEN PENG, WANG HAIXIA
Format: Patent
Language: chi ; eng
Online access: Order full text
creator ZHU CHENGFANG
ZHANG YILONG
LIANG RONGHUA
CHEN PENG
WANG HAIXIA
description The invention discloses an OCT fingerprint section image authenticity detection method based on reconstruction difference. The method comprises the following steps: S1, constructing a fully convolutional neural network model comprising an encoder, a generator and a feature extractor; S2, collecting images acquired by an OCT system and, after preprocessing is completed, randomly selecting 70% of the positive sample images as training data, then taking the remaining 30% of positive sample images together with the negative sample images as test data after balancing their quantities; S3, training the network model: using the divided training images as input data, setting a loss function for optimizing the encoder and the generator, setting a comparison loss for optimizing the feature extractor, and performing multi-round training on the established network model, updating the model weight parameters through back propagation until the loss functions converge, then stopping training; S4, testing the network mo
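Steps S2-S4 describe training on genuine (positive) samples only and scoring test images by how poorly they reconstruct. The sketch below is a hedged illustration of one way such a training step and reconstruction-difference score could look, reusing the hypothetical Encoder/Generator/FeatureExtractor classes sketched earlier. The L1 reconstruction loss, cosine-based comparison loss, combined score, and optimizer split are assumptions made for illustration; the patent abstract does not specify these forms, nor the thresholding rule, and the 70/30 split and quantity balancing are omitted here.

```python
# Hedged sketch of training (S3) and reconstruction-difference scoring (S4).
# Loss forms and the scoring rule are illustrative assumptions, not the patent's exact method.
import torch
import torch.nn.functional as F

def train_step(encoder, generator, extractor, batch, opt_rec, opt_feat):
    """One optimization step on a batch of genuine fingerprint section images."""
    # Reconstruction loss optimizes the encoder and generator (assumed L1 form).
    recon = generator(encoder(batch))
    loss_rec = F.l1_loss(recon, batch)
    opt_rec.zero_grad()
    loss_rec.backward()
    opt_rec.step()

    # Comparison loss optimizes only the feature extractor: features of a genuine
    # image and of its reconstruction are pulled together (assumed cosine form).
    with torch.no_grad():
        recon = generator(encoder(batch))  # recomputed so encoder/generator get no gradients
    f_real = extractor(batch)
    f_recon = extractor(recon)
    loss_cmp = 1.0 - F.cosine_similarity(f_real, f_recon).mean()
    opt_feat.zero_grad()
    loss_cmp.backward()
    opt_feat.step()
    return loss_rec.item(), loss_cmp.item()

def authenticity_score(encoder, generator, extractor, image):
    """Larger reconstruction difference suggests a fake (negative) sample."""
    with torch.no_grad():
        recon = generator(encoder(image))
        pixel_diff = F.l1_loss(recon, image)
        feat_diff = 1.0 - F.cosine_similarity(extractor(image), extractor(recon)).mean()
    return (pixel_diff + feat_diff).item()
```

In such a setup the score would be compared against a threshold chosen on held-out data; since the abstract is truncated before step S4's details, the decision rule above is left as an open assumption.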
format Patent
language chi ; eng
recordid cdi_epo_espacenet_CN114581963A
source esp@cenet
subjects CALCULATING
COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
COMPUTING
COUNTING
PHYSICS
title OCT fingerprint section image authenticity detection method based on reconstruction difference
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-27T13%3A58%3A37IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=ZHU%20CHENGFANG&rft.date=2022-06-03&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3ECN114581963A%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true