POSTECH Submission on Duolingo Shared Task
Junsu Park, Hongseok Kwon, Jong-Hyeok Lee
Abstract
In this paper, we propose a transfer-learning-based simultaneous translation model that extends BART. We pre-trained BART on Korean Wikipedia and a Korean news dataset, and fine-tuned it on an additional web-crawled parallel corpus together with the 2020 Duolingo official training dataset. In experiments on the 2020 Duolingo test dataset, our submission achieves a weighted macro F1 score of 0.312 and ranks second among the submitted En-Ko systems.
- Anthology ID: 2020.ngt-1.16
- Volume: Proceedings of the Fourth Workshop on Neural Generation and Translation
- Month: July
- Year: 2020
- Address: Online
- Venues: ACL | NGT | WS
- Publisher: Association for Computational Linguistics
- Pages: 139–143
- URL: https://www.aclweb.org/anthology/2020.ngt-1.16
- PDF: https://www.aclweb.org/anthology/2020.ngt-1.16.pdf
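The abstract reports a weighted macro F1 score, i.e. per-class F1 values averaged with weights proportional to class support. The sketch below is a simplified, hedged illustration of that averaging in pure Python; the official Duolingo STAPLE scorer additionally weights accepted translations by learner-response frequency, which is not reproduced here.

```python
from collections import Counter


def weighted_macro_f1(y_true, y_pred):
    """Support-weighted average of per-class F1 scores.

    A minimal stand-in for a weighted macro F1 metric; the official
    Duolingo task scorer is more involved (it weights candidate
    translations, not just classes).
    """
    classes = sorted(set(y_true) | set(y_pred))
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for c in classes:
        # Count true positives, false positives, false negatives per class.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        # Weight each class's F1 by its share of the true labels.
        score += (support[c] / total) * f1
    return score
```

For example, `weighted_macro_f1([0, 0, 1, 2, 2, 2], [0, 1, 1, 2, 2, 0])` averages per-class F1 values of 0.5, 0.667, and 0.8 with weights 2/6, 1/6, and 3/6.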