Abstract
This paper presents an analysis of pre-trained Transformer models for Neural Machine Translation (NMT) on the Ancient-Chinese-to-Modern-Chinese translation task.

- Anthology ID: 2023.alt-1.3
- Volume: Proceedings of ALT2023: Ancient Language Translation Workshop
- Month: September
- Year: 2023
- Address: Macau SAR, China
- Venue: alt
- Publisher: Asia-Pacific Association for Machine Translation
- Pages: 23–28
- URL: https://aclanthology.org/2023.alt-1.3
- Cite (ACL): Jiahui Wang, Xuqin Zhang, Jiahuan Li, and Shujian Huang. 2023. Pre-trained Model In Ancient-Chinese-to-Modern-Chinese Machine Translation. In Proceedings of ALT2023: Ancient Language Translation Workshop, pages 23–28, Macau SAR, China. Asia-Pacific Association for Machine Translation.
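For convenience, a BibTeX entry assembled from the metadata above is sketched below; all field values come from this page, but the entry key is an assumption following the Anthology's usual author-year naming convention.

```bibtex
@inproceedings{wang-etal-2023-pretrained,  % entry key is an assumption
  title     = {Pre-trained Model In Ancient-Chinese-to-Modern-Chinese Machine Translation},
  author    = {Wang, Jiahui and Zhang, Xuqin and Li, Jiahuan and Huang, Shujian},
  booktitle = {Proceedings of ALT2023: Ancient Language Translation Workshop},
  month     = sep,
  year      = {2023},
  address   = {Macau SAR, China},
  publisher = {Asia-Pacific Association for Machine Translation},
  pages     = {23--28},
  url       = {https://aclanthology.org/2023.alt-1.3},
}
```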
- Cite (Informal): Pre-trained Model In Ancient-Chinese-to-Modern-Chinese Machine Translation (Wang et al., alt 2023)
- PDF: https://aclanthology.org/2023.alt-1.3.pdf