On the differences between BERT and MT encoder spaces and how to address them in translation tasks
Raúl Vázquez, Hande Celikkanat, Mathias Creutz, Jörg Tiedemann
Abstract
Various studies show that pretrained language models such as BERT cannot straightforwardly replace encoders in neural machine translation despite their enormous success in other tasks. This is even more astonishing considering the similarities between the architectures. This paper sheds some light on the embedding spaces they create, using average cosine similarity, contextuality metrics and measures for representational similarity for comparison, revealing that BERT and NMT encoder representations look significantly different from one another. In order to address this issue, we propose a supervised transformation from one into the other using explicit alignment and fine-tuning. Our results demonstrate the need for such a transformation to improve the applicability of BERT in MT.
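The abstract mentions two ingredients: comparing the two embedding spaces (e.g., with average cosine similarity) and learning a supervised transformation from one space into the other. The snippet below is a minimal sketch of that idea using numpy only; the matrices are synthetic stand-ins for BERT and NMT encoder token representations, and the closed-form least-squares map is an illustrative choice, not the paper's actual alignment and fine-tuning procedure.

```python
import numpy as np

def average_cosine_similarity(x, y):
    """Mean cosine similarity between corresponding rows of x and y."""
    x_n = x / np.linalg.norm(x, axis=1, keepdims=True)
    y_n = y / np.linalg.norm(y, axis=1, keepdims=True)
    return float(np.mean(np.sum(x_n * y_n, axis=1)))

def fit_linear_map(src, tgt):
    """Least-squares linear map W minimising ||src @ W - tgt||^2."""
    w, *_ = np.linalg.lstsq(src, tgt, rcond=None)
    return w

# Synthetic stand-ins for aligned token representations (n tokens x d dims);
# in the paper these would come from BERT and an NMT encoder on parallel data.
rng = np.random.default_rng(0)
bert_states = rng.normal(size=(2000, 768))
nmt_states = np.tanh(bert_states @ rng.normal(size=(768, 768)) / np.sqrt(768))

# Spaces look dissimilar before alignment ...
print("similarity before mapping:",
      round(average_cosine_similarity(bert_states, nmt_states), 3))

# ... and much closer after a supervised linear transformation.
w = fit_linear_map(bert_states, nmt_states)
print("similarity after mapping: ",
      round(average_cosine_similarity(bert_states @ w, nmt_states), 3))
```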
- Anthology ID: 2021.acl-srw.35
- Volume: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing: Student Research Workshop
- Month: August
- Year: 2021
- Address: Online
- Editors: Jad Kabbara, Haitao Lin, Amandalynne Paullada, Jannis Vamvas
- Venues: ACL | IJCNLP
- Publisher: Association for Computational Linguistics
- Pages: 337–347
- URL: https://preview.aclanthology.org/build-pipeline-with-new-library/2021.acl-srw.35/
- DOI: 10.18653/v1/2021.acl-srw.35
- Cite (ACL): Raúl Vázquez, Hande Celikkanat, Mathias Creutz, and Jörg Tiedemann. 2021. On the differences between BERT and MT encoder spaces and how to address them in translation tasks. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing: Student Research Workshop, pages 337–347, Online. Association for Computational Linguistics.
- Cite (Informal): On the differences between BERT and MT encoder spaces and how to address them in translation tasks (Vázquez et al., ACL-IJCNLP 2021)
- PDF: https://preview.aclanthology.org/build-pipeline-with-new-library/2021.acl-srw.35.pdf
- Data: MuST-C