Some Tradeoffs in Continual Learning for Parliamentary Neural Machine Translation Systems

Rebecca Knowles, Samuel Larkin, Michel Simard, Marc A Tessier, Gabriel Bernier-Colborne, Cyril Goutte, Chi-kiu Lo


Abstract
In long-term translation projects, such as the translation of Parliamentary text, there is a desire to build machine translation systems that can adapt to changes over time. We implement and examine a simple approach to continual learning for neural machine translation, exploring tradeoffs between consistency, the model’s ability to learn from incoming data, and the time a client would need to wait to obtain a newly trained translation system.
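The abstract does not spell out the mechanics, but one common realization of a simple continual-learning setup for NMT is to periodically continue training the existing checkpoint on newly arrived parallel data instead of retraining from scratch. The sketch below illustrates that general idea only; the toy model (`ToyTranslator`), the placeholder data generator (`new_period_data`), and all hyperparameters are assumptions made for illustration and are not taken from the paper.

```python
# Hypothetical sketch of continual learning by incremental fine-tuning:
# at each time period, continue training the previous checkpoint on the
# newly arrived parallel data. This is not the authors' exact recipe.
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
VOCAB, DIM = 1000, 64

class ToyTranslator(nn.Module):
    """Stand-in for an NMT model: embeds source tokens, predicts target tokens."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.rnn = nn.GRU(DIM, DIM, batch_first=True)
        self.out = nn.Linear(DIM, VOCAB)

    def forward(self, src):
        hidden, _ = self.rnn(self.embed(src))
        return self.out(hidden)  # (batch, seq_len, VOCAB) logits

def fine_tune(model, batches, lr=1e-3, epochs=1):
    """Continue training `model` in place on newly arrived parallel batches."""
    optim = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for src, tgt in batches:
            optim.zero_grad()
            logits = model(src)
            loss = loss_fn(logits.reshape(-1, VOCAB), tgt.reshape(-1))
            loss.backward()
            optim.step()
    return model

def new_period_data(n_sent=32, seq_len=10):
    """Placeholder for the parallel text arriving in one time period."""
    src = torch.randint(0, VOCAB, (n_sent, seq_len))
    tgt = torch.randint(0, VOCAB, (n_sent, seq_len))
    return [(src, tgt)]

model = ToyTranslator()              # initial system, e.g. trained on historical data
for period in range(3):              # each iteration = one incremental update
    previous = copy.deepcopy(model)  # kept for consistency checks or rollback
    model = fine_tune(model, new_period_data())
    # In practice, the updated model would be compared against `previous` on
    # held-out recent data before deployment.
```

Incremental updates like this shorten the time a client waits for a refreshed system, while retraining from scratch on all accumulated data would take longer; the abstract frames this as a tradeoff among consistency, the ability to learn from incoming data, and turnaround time.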
Anthology ID: 2024.amta-research.10
Volume: Proceedings of the 16th Conference of the Association for Machine Translation in the Americas (Volume 1: Research Track)
Month: September
Year: 2024
Address: Chicago, USA
Editors: Rebecca Knowles, Akiko Eriguchi, Shivali Goel
Venue: AMTA
Publisher: Association for Machine Translation in the Americas
Pages: 102–118
URL: https://aclanthology.org/2024.amta-research.10
Cite (ACL): Rebecca Knowles, Samuel Larkin, Michel Simard, Marc A Tessier, Gabriel Bernier-Colborne, Cyril Goutte, and Chi-kiu Lo. 2024. Some Tradeoffs in Continual Learning for Parliamentary Neural Machine Translation Systems. In Proceedings of the 16th Conference of the Association for Machine Translation in the Americas (Volume 1: Research Track), pages 102–118, Chicago, USA. Association for Machine Translation in the Americas.
Cite (Informal): Some Tradeoffs in Continual Learning for Parliamentary Neural Machine Translation Systems (Knowles et al., AMTA 2024)
PDF: https://preview.aclanthology.org/landing_page/2024.amta-research.10.pdf