The Resistance to Label Noise in K-NN and DNN Depends on its Concentration
We investigate the classification performance of K-nearest neighbors (K-NN) and deep neural networks (DNNs) in the presence of label noise. We first show empirically that a DNN's prediction for a given test example depends on the labels of the training examples in its local neighborhood. This motivates us to derive a realizable analytic expression that approximates the multi-class K-NN classification error in the presence of label noise, which is of independent interest. We then suggest that this K-NN expression may serve as a first-order approximation of the DNN error. Finally, we demonstrate empirically that the derived expression closely matches the observed performance of K-NN and DNN classifiers. Our result may explain the previously observed, surprising resistance of DNNs to some types of label noise, and it identifies noise concentration as a key factor: the more concentrated the noise, the greater the degradation in performance.
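The abstract's central claim, that label noise concentrated on particular classes degrades accuracy far more than noise spread uniformly over the training set, is easy to probe with a small experiment. The sketch below is not the authors' code and does not reproduce their analytic expression; the dataset (scikit-learn's digits), K = 5, the 8% corruption rate, and the class pair 3 → 8 are all illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): compare K-NN test accuracy under
# label noise that is spread uniformly over the training set versus noise
# concentrated on a single source class flipped to a single wrong class.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
n_classes = len(np.unique(y))
noise_rate = 0.08  # kept small so one class can absorb all the flips

def corrupt(labels, concentrated):
    """Flip noise_rate of the training labels, either spread out or concentrated."""
    noisy = labels.copy()
    n_flip = int(noise_rate * len(labels))
    if concentrated:
        # all corrupted examples come from class 3 and receive the same wrong label 8
        pool = np.flatnonzero(labels == 3)
        flip = rng.choice(pool, size=min(n_flip, len(pool)), replace=False)
        noisy[flip] = 8
    else:
        # corrupted examples are drawn uniformly and get a uniformly random wrong label
        flip = rng.choice(len(labels), size=n_flip, replace=False)
        offset = rng.integers(1, n_classes, size=n_flip)
        noisy[flip] = (labels[flip] + offset) % n_classes
    return noisy

for name, y_train in [("clean", y_tr),
                      ("spread noise", corrupt(y_tr, concentrated=False)),
                      ("concentrated noise", corrupt(y_tr, concentrated=True))]:
    knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_train)
    print(f"{name:>20}: test accuracy = {knn.score(X_te, y_te):.3f}")
```

Under these assumptions, the uniformly spread flips are largely voted away by the K nearest neighbors, whereas the concentrated flips can systematically outvote the true label within the affected class, which is the qualitative behavior the abstract describes.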
Saved in:
Main authors: | Drory, Amnon; Ratzon, Oria; Avidan, Shai; Giryes, Raja |
---|---|
Format: | Article |
Language: | eng |
Subjects: | Computer Science - Computer Vision and Pattern Recognition; Computer Science - Learning; Computer Science - Neural and Evolutionary Computing; Statistics - Machine Learning |
Online access: | Order full text |
creator | Drory, Amnon; Ratzon, Oria; Avidan, Shai; Giryes, Raja |
doi_str_mv | 10.48550/arxiv.1803.11410 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | DOI: 10.48550/arxiv.1803.11410 |
language | eng |
recordid | cdi_arxiv_primary_1803_11410 |
source | arXiv.org |
subjects | Computer Science - Computer Vision and Pattern Recognition; Computer Science - Learning; Computer Science - Neural and Evolutionary Computing; Statistics - Machine Learning |
title | The Resistance to Label Noise in K-NN and DNN Depends on its Concentration |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-11T12%3A30%3A08IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=The%20Resistance%20to%20Label%20Noise%20in%20K-NN%20and%20DNN%20Depends%20on%20its%20Concentration&rft.au=Drory,%20Amnon&rft.date=2018-03-30&rft_id=info:doi/10.48550/arxiv.1803.11410&rft_dat=%3Carxiv_GOX%3E1803_11410%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |