Netmarble AI Center's WMT21 Automatic Post-Editing Shared Task Submission


Detailed Description

Saved in:
Bibliographic Details
Published in: arXiv.org, 2021-11
Main Authors: Oh, Shinhyeok; Jang, Sion; Hu, Xu; An, Shounan; Oh, Insoo
Format: Article
Language: English
Subjects: Curricula; Datasets; Editing; Machine translation; Neural networks; Training
Online Access: Full text
container_title arXiv.org
creator Oh, Shinhyeok
Jang, Sion
Hu, Xu
An, Shounan
Oh, Insoo
description This paper describes Netmarble's submission to the WMT21 Automatic Post-Editing (APE) Shared Task for the English-German language pair. First, we propose a Curriculum Training Strategy across training stages. Facebook FAIR's WMT19 news translation model was chosen to leverage a large and powerful pre-trained neural network. We then post-train the translation model with different levels of data at each training stage. As the training stages progress, we gradually make the system learn to solve multiple tasks by adding extra information at successive stages. We also show a way to utilize large volumes of additional data for APE tasks. For further improvement, we apply a Multi-Task Learning Strategy with Dynamic Weight Average during the fine-tuning stage. To fine-tune on the limited APE corpus, we add related subtasks so the model learns a unified representation. Finally, for better performance, we leverage external translations as augmented machine translation (MT) during post-training and fine-tuning. Experimental results show that our APE system significantly improves the provided MT output by -2.848 TER and +3.74 BLEU on the development dataset. It also demonstrates its effectiveness on the test dataset, with higher quality than on the development dataset.
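The Dynamic Weight Average mentioned in the abstract is the multi-task loss-weighting scheme of Liu et al. (2019): each task's loss weight is derived from how quickly that loss has been descending over recent epochs, so tasks whose loss is falling more slowly get larger weights. Below is a minimal sketch of that scheme, not the authors' actual implementation; the function name, task names, and temperature value are illustrative assumptions.

```python
import math

def dwa_weights(loss_history, temperature=2.0):
    """Dynamic Weight Average (Liu et al., 2019), a minimal sketch.

    loss_history: dict mapping task name -> list of per-epoch losses.
    Returns per-task weights that sum to the number of tasks.
    """
    tasks = list(loss_history)
    K = len(tasks)
    # Until two epochs of losses exist, weight all tasks equally.
    if any(len(losses) < 2 for losses in loss_history.values()):
        return {t: 1.0 for t in tasks}
    # Relative descending rate of each task's loss (ratio of the two
    # most recent epoch losses; closer to 1 means slower descent).
    ratios = {t: loss_history[t][-1] / loss_history[t][-2] for t in tasks}
    # Softmax over the rates, scaled so the weights sum to K.
    denom = sum(math.exp(ratios[t] / temperature) for t in tasks)
    return {t: K * math.exp(ratios[t] / temperature) / denom for t in tasks}

# Hypothetical usage: an APE task trained jointly with two auxiliary
# subtasks (the task names here are placeholders, not from the paper).
history = {"ape": [2.1, 1.7], "tag": [0.9, 0.8], "mt": [1.4, 1.3]}
weights = dwa_weights(history)
print(" + ".join(f"{w:.3f}*L_{t}" for t, w in weights.items()))
```

In a setting like the one the abstract describes, such weights would scale the APE loss and the auxiliary subtask losses at each fine-tuning epoch, shifting capacity toward tasks whose loss has stopped improving.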
format Article
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2021-11
issn 2331-8422
language eng
recordid cdi_proquest_journals_2598306992
source Free E-Journals
subjects Curricula
Datasets
Editing
Machine translation
Neural networks
Training
title Netmarble AI Center's WMT21 Automatic Post-Editing Shared Task Submission