Abstract
Training data for machine translation (MT) is often sourced from a multitude of large corpora that are multi-faceted in nature, e.g., containing content from multiple domains or of different levels of quality or complexity. Naturally, these facets do not occur with equal frequency, nor are they equally important for the test scenario at hand. In this work, we propose to optimize this balance jointly with MT model parameters to relieve system developers from manual schedule design. A multi-armed bandit is trained to dynamically choose between facets in a way that is most beneficial for the MT system. We evaluate it on three different multi-facet applications: balancing translationese and natural training data, data from multiple domains, and data from multiple language pairs. We find that bandit learning leads to competitive MT systems across tasks, and our analysis provides insights into its learned strategies and the underlying data sets.
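The abstract describes training a multi-armed bandit jointly with the MT model so that, at each step, the bandit picks which data facet to sample the next batch from. The sketch below is only an illustration of that general idea, not the paper's exact algorithm: it assumes an EXP3-style bandit and a hypothetical reward such as the per-batch drop in validation loss; the names `Exp3FacetSampler`, `facet_loaders`, `dev_loss_fn`, and `mt_model.train_on_batch` are placeholders invented for this example.

```python
import math
import random


class Exp3FacetSampler:
    """Minimal EXP3-style bandit over data facets (illustrative sketch).

    Each arm corresponds to one facet of the training data,
    e.g. a domain, a language pair, or translationese vs. natural text.
    """

    def __init__(self, num_facets, gamma=0.1):
        self.num_facets = num_facets
        self.gamma = gamma                      # exploration rate
        self.log_weights = [0.0] * num_facets   # log of EXP3 arm weights

    def probabilities(self):
        # Softmax over log-weights, mixed with uniform exploration.
        m = max(self.log_weights)
        exp_w = [math.exp(w - m) for w in self.log_weights]
        total = sum(exp_w)
        return [
            (1 - self.gamma) * w / total + self.gamma / self.num_facets
            for w in exp_w
        ]

    def sample(self):
        probs = self.probabilities()
        arm = random.choices(range(self.num_facets), weights=probs)[0]
        return arm, probs

    def update(self, arm, reward, probs):
        # Importance-weighted reward keeps the estimate unbiased.
        # EXP3 assumes rewards in [0, 1]; a real system would rescale or clip.
        self.log_weights[arm] += self.gamma * (reward / probs[arm]) / self.num_facets


def train(mt_model, facet_loaders, dev_loss_fn, steps=1000):
    """Hypothetical training loop: sample a facet, train on one batch from it,
    and reward the bandit with the resulting improvement in dev loss."""
    bandit = Exp3FacetSampler(num_facets=len(facet_loaders))
    prev_loss = dev_loss_fn(mt_model)
    for _ in range(steps):
        arm, probs = bandit.sample()
        batch = next(facet_loaders[arm])   # batch from the chosen facet
        mt_model.train_on_batch(batch)     # placeholder MT parameter update
        curr_loss = dev_loss_fn(mt_model)
        bandit.update(arm, reward=prev_loss - curr_loss, probs=probs)
        prev_loss = curr_loss
    return mt_model
```

In this reading, the bandit's arm probabilities play the role of the data-mixing schedule that would otherwise be designed by hand; how the reward is defined and scaled is a key design choice and differs in the actual paper.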
- Anthology ID: 2021.findings-emnlp.274
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2021
- Month: November
- Year: 2021
- Address: Punta Cana, Dominican Republic
- Editors: Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
- Venue: Findings
- SIG: SIGDAT
- Publisher: Association for Computational Linguistics
- Pages: 3190–3204
- URL: https://aclanthology.org/2021.findings-emnlp.274
- DOI: 10.18653/v1/2021.findings-emnlp.274
- Cite (ACL): Julia Kreutzer, David Vilar, and Artem Sokolov. 2021. Bandits Don’t Follow Rules: Balancing Multi-Facet Machine Translation with Multi-Armed Bandits. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 3190–3204, Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal): Bandits Don’t Follow Rules: Balancing Multi-Facet Machine Translation with Multi-Armed Bandits (Kreutzer et al., Findings 2021)
- PDF: https://preview.aclanthology.org/nschneid-patch-3/2021.findings-emnlp.274.pdf
- Data: OPUS-100