Towards Robust k-Nearest-Neighbor Machine Translation
Saved in:
Main authors: | Jiang, Hui; Lu, Ziyao; Meng, Fandong; Zhou, Chulun; Zhou, Jie; Huang, Degen; Su, Jinsong |
---|---|
Format: | Article |
Language: | eng |
Subjects: | Computer Science - Computation and Language |
Online access: | Order full text |
creator | Jiang, Hui; Lu, Ziyao; Meng, Fandong; Zhou, Chulun; Zhou, Jie; Huang, Degen; Su, Jinsong |
description | k-Nearest-Neighbor Machine Translation (kNN-MT) has become an important research
direction in NMT in recent years. Its main idea is to retrieve useful key-value
pairs from an additional datastore to modify translations without updating the
NMT model. However, noisy retrieved pairs can dramatically degrade model
performance. In this paper, we conduct a preliminary study and find that this
problem results from not fully exploiting the predictions of the NMT model. To
alleviate the impact of noise, we propose a confidence-enhanced kNN-MT model
with robust training. Concretely, we introduce NMT confidence to refine the
modeling of two important components of kNN-MT: the kNN distribution and the
interpolation weight. Meanwhile, we inject two types of perturbations into the
retrieved pairs for robust training. Experimental results on four benchmark
datasets demonstrate that our model not only achieves significant improvements
over current kNN-MT models but also exhibits better robustness. Our code is
available at https://github.com/DeepLearnXMU/Robust-knn-mt. |
doi_str_mv | 10.48550/arxiv.2210.08808 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | DOI: 10.48550/arxiv.2210.08808 |
language | eng |
recordid | cdi_arxiv_primary_2210_08808 |
source | arXiv.org |
subjects | Computer Science - Computation and Language |
title | Towards Robust k-Nearest-Neighbor Machine Translation |
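The abstract describes kNN-MT as mixing a retrieval-based kNN distribution, built from key-value pairs fetched from a datastore, with the NMT model's own distribution via an interpolation weight. A minimal sketch of that standard interpolation step is shown below; it is illustrative only (the function name, temperature, weight, and toy values are assumptions, not the paper's released code, which handles the confidence-based refinements the abstract proposes):

```python
import math

def knn_interpolate(nmt_probs, retrieved, vocab_size, temperature=10.0, lam=0.5):
    """Sketch of vanilla kNN-MT interpolation: turn retrieved
    (distance, target_token) pairs into a distribution over the vocabulary
    and mix it with the NMT model's distribution."""
    # Softmax over negative distances: closer neighbors get more mass.
    scores = [math.exp(-d / temperature) for d, _ in retrieved]
    z = sum(scores)
    knn_probs = [0.0] * vocab_size
    for (_, tok), s in zip(retrieved, scores):
        knn_probs[tok] += s / z
    # Interpolate: p(y) = lam * p_kNN(y) + (1 - lam) * p_NMT(y)
    return [lam * pk + (1 - lam) * pn for pk, pn in zip(knn_probs, nmt_probs)]

# Toy example: vocab of 4 tokens; the NMT model prefers token 1,
# but the retrieved neighbors mostly vote for token 2.
nmt = [0.1, 0.6, 0.2, 0.1]
neighbors = [(1.0, 2), (2.0, 2), (5.0, 1)]  # (distance, target token id)
mixed = knn_interpolate(nmt, neighbors, vocab_size=4)
```

With a fixed `lam`, a few noisy neighbors can flip the prediction regardless of how confident the NMT model is; the paper's contribution is to make both the kNN distribution and `lam` depend on NMT confidence, so such noise matters less.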