FORGETTING DATA SAMPLES FROM PRETRAINED NEURAL NETWORK MODELS
A method for forgetting data samples from a pretrained neural network (NN) model is provided. The method includes training an adversarial model to classify training data samples as members of the NN model and test data samples as non-members of the NN model. The method includes performing the following iteratively until the NN model has forgotten a specified threshold of data samples to be forgotten: (1) classifying the data samples as members or non-members using the trained adversarial model; (2) for the member data samples, determining a subset that includes data samples to be forgotten; (3) labeling the data samples within the subset as non-members and updating the NN model based on weight update techniques that cause the NN model to forget the data samples; (4) retraining the NN model without the data samples that have been forgotten; and (5) retraining the adversarial model for the next iteration.
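The abstract describes an adversary-guided unlearning loop: a membership classifier decides which forget-set samples the model still "remembers", the model is updated until they look like non-members, and both models are refreshed each round. The following is a minimal PyTorch sketch of that loop under assumptions of my own, not taken from the patent: the adversary reads the target model's softmax outputs, the "forgetting" update is a single gradient-ascent step on the still-remembered samples, the stopping test is the fraction of forget samples the adversary labels as non-members, and the helper names (`train_adversary`, `forget_step`, `retrain_step`, `unlearn`) are purely illustrative.

```python
# Sketch of the adversary-guided forgetting loop described in the abstract.
# Assumptions (not from the patent): membership features are the target's softmax
# outputs, "forgetting" is gradient ascent on the forget samples, and the loop stops
# once a chosen fraction of the forget set is classified as a non-member.
import torch
import torch.nn.functional as F


def train_adversary(adversary, target, member_x, nonmember_x, epochs=5, lr=1e-3):
    """Train the adversary: 1 for members (training data), 0 for non-members (test data)."""
    opt = torch.optim.Adam(adversary.parameters(), lr=lr)
    with torch.no_grad():
        feats = torch.cat([F.softmax(target(member_x), dim=1),
                           F.softmax(target(nonmember_x), dim=1)])
    labels = torch.cat([torch.ones(len(member_x)), torch.zeros(len(nonmember_x))])
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.binary_cross_entropy_with_logits(adversary(feats).squeeze(1), labels)
        loss.backward()
        opt.step()


def forget_step(target, forget_x, forget_y, lr=1e-4):
    """Illustrative weight update that pushes the model away from the forget samples."""
    opt = torch.optim.SGD(target.parameters(), lr=lr)
    opt.zero_grad()
    (-F.cross_entropy(target(forget_x), forget_y)).backward()  # maximize loss on forget set
    opt.step()


def retrain_step(target, keep_x, keep_y, lr=1e-4):
    """Refresh the target model on the retained data only."""
    opt = torch.optim.SGD(target.parameters(), lr=lr)
    opt.zero_grad()
    F.cross_entropy(target(keep_x), keep_y).backward()
    opt.step()


def unlearn(target, adversary, train_x, train_y, test_x,
            forget_idx, threshold=0.9, max_iters=50):
    forget_mask = torch.zeros(len(train_x), dtype=torch.bool)
    forget_mask[forget_idx] = True
    keep_x, keep_y = train_x[~forget_mask], train_y[~forget_mask]
    forget_x, forget_y = train_x[forget_mask], train_y[forget_mask]

    train_adversary(adversary, target, train_x, test_x)
    for _ in range(max_iters):
        with torch.no_grad():  # (1) classify forget samples as members / non-members
            member = torch.sigmoid(
                adversary(F.softmax(target(forget_x), dim=1))).squeeze(1) > 0.5
        if (~member).float().mean() >= threshold:  # enough samples look forgotten
            break
        still_member = member.nonzero(as_tuple=True)[0]      # (2) still-remembered subset
        forget_step(target, forget_x[still_member], forget_y[still_member])  # (3) forget
        retrain_step(target, keep_x, keep_y)                  # (4) retrain on retained data
        train_adversary(adversary, target, keep_x, test_x)    # (5) refresh adversary
    return target


if __name__ == "__main__":
    # Toy usage with random data and small MLPs; shapes and sizes are arbitrary.
    import torch.nn as nn
    torch.manual_seed(0)
    n_feat, n_cls = 20, 5
    target = nn.Sequential(nn.Linear(n_feat, 32), nn.ReLU(), nn.Linear(32, n_cls))
    adversary = nn.Sequential(nn.Linear(n_cls, 16), nn.ReLU(), nn.Linear(16, 1))
    train_x, train_y = torch.randn(200, n_feat), torch.randint(0, n_cls, (200,))
    test_x = torch.randn(100, n_feat)
    unlearn(target, adversary, train_x, train_y, test_x, forget_idx=torch.arange(20))
```

Note that in this sketch the only stopping signal is the adversary's own membership verdict on the forget set, mirroring the threshold test in the abstract; the particular forgetting and retraining updates could be replaced by any weight-update technique that serves the same purpose.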
Main authors: | FARKASH, Ariel; SHMELKIN, Ron; GOLDSTEEN, Abigail |
---|---|
Format: | Patent |
Language: | eng |
Subjects: | CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; PHYSICS |
Online access: | Order full text |
creator | FARKASH, Ariel; SHMELKIN, Ron; GOLDSTEEN, Abigail |
description | A method for forgetting data samples from a pretrained neural network (NN) model is provided. The method includes training an adversarial model to classify training data samples as members of the NN model and test data samples as non-members of the NN model. The method includes performing the following iteratively until the NN model has forgotten a specified threshold of data samples to be forgotten: (1) classifying the data samples as members or non-members using the trained adversarial model; (2) for the member data samples, determining a subset that includes data samples to be forgotten; (3) labeling the data samples within the subset as non-members and updating the NN model based on weight update techniques that cause the NN model to forget the data samples; (4) retraining the NN model without the data samples that have been forgotten; and (5) retraining the adversarial model for the next iteration. |
format | Patent |
identifier | US2022300822A1 |
language | eng |
recordid | cdi_epo_espacenet_US2022300822A1 |
source | esp@cenet |
subjects | CALCULATING; COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS; COMPUTING; COUNTING; PHYSICS |
title | FORGETTING DATA SAMPLES FROM PRETRAINED NEURAL NETWORK MODELS |
url | https://worldwide.espacenet.com/publicationDetails/biblio?FT=D&date=20220922&DB=EPODOC&CC=US&NR=2022300822A1 |