Can Monolingual Pre-trained Encoder-Decoder Improve NMT for Distant Language Pairs?

Hwichan Kim, Mamoru Komachi


Anthology ID:
2021.paclic-1.25
Volume:
Proceedings of the 35th Pacific Asia Conference on Language, Information and Computation
Month:
November
Year:
2021
Address:
Shanghai, China
Venue:
PACLIC
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
235–243
Language:
URL:
https://aclanthology.org/2021.paclic-1.25
DOI:
Bibkey:
Cite (ACL):
Hwichan Kim and Mamoru Komachi. 2021. Can Monolingual Pre-trained Encoder-Decoder Improve NMT for Distant Language Pairs?. In Proceedings of the 35th Pacific Asia Conference on Language, Information and Computation, pages 235–243, Shanghai, China. Association for Computational Linguistics.
Cite (Informal):
Can Monolingual Pre-trained Encoder-Decoder Improve NMT for Distant Language Pairs? (Kim & Komachi, PACLIC 2021)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2021.paclic-1.25.pdf