Dynamic Curriculum Learning for Low-Resource Neural Machine Translation

Chen Xu, Bojie Hu, Yufan Jiang, Kai Feng, Zeyang Wang, Shen Huang, Qi Ju, Tong Xiao, Jingbo Zhu


Abstract
Large amounts of data have made neural machine translation (NMT) a big success in recent years. But training these models on small-scale corpora remains a challenge, and in this case the way the data is used becomes more important. Here, we investigate the effective use of training data for low-resource NMT. In particular, we propose a dynamic curriculum learning (DCL) method to reorder training samples during training. Unlike previous work, we do not use a static scoring function for reordering. Instead, the order of training samples is dynamically determined in two ways: loss decline and model competence. This eases training by highlighting easy samples that the current model has enough competence to learn. We test our DCL method in a Transformer-based system. Experimental results show that DCL outperforms several strong baselines on three low-resource machine translation benchmarks and on different-sized subsets of WMT'16 En-De.
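The abstract describes the core idea at a high level: samples are reordered by a dynamic difficulty signal (loss decline between training checkpoints), and the model's current competence limits how much of the ordered data is exposed at each step. The sketch below is a minimal illustration of that idea, not the authors' implementation; the function names, the square-root competence schedule, and the loss-decline ranking are assumptions chosen for clarity.

```python
# Illustrative sketch (assumed, not the paper's exact formulation):
# rank samples by per-sample loss decline (larger decline ~ easier to learn
# right now) and keep only the fraction the model is "competent" to handle.
import math
import random


def competence(step, total_steps, c0=0.1):
    """Square-root competence schedule, a common choice in curriculum NMT."""
    return min(1.0, math.sqrt(c0 ** 2 + (1.0 - c0 ** 2) * step / total_steps))


def select_training_pool(sample_ids, prev_loss, curr_loss, step, total_steps):
    """Order samples by loss decline, then truncate to the competent fraction."""
    decline = {i: prev_loss[i] - curr_loss[i] for i in sample_ids}
    ranked = sorted(sample_ids, key=lambda i: decline[i], reverse=True)
    k = max(1, int(competence(step, total_steps) * len(ranked)))
    return ranked[:k]


# Toy usage: random values stand in for real per-sample NMT losses.
ids = list(range(1000))
prev = {i: random.uniform(2.0, 6.0) for i in ids}
curr = {i: prev[i] - random.uniform(0.0, 1.0) for i in ids}
pool = select_training_pool(ids, prev, curr, step=2000, total_steps=20000)
print(len(pool), "samples available at this training step")
```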
Anthology ID:
2020.coling-main.352
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3977–3989
URL:
https://aclanthology.org/2020.coling-main.352
DOI:
10.18653/v1/2020.coling-main.352
Cite (ACL):
Chen Xu, Bojie Hu, Yufan Jiang, Kai Feng, Zeyang Wang, Shen Huang, Qi Ju, Tong Xiao, and Jingbo Zhu. 2020. Dynamic Curriculum Learning for Low-Resource Neural Machine Translation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3977–3989, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Dynamic Curriculum Learning for Low-Resource Neural Machine Translation (Xu et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.352.pdf