Mixed-Lingual Pre-training for Cross-lingual Summarization
Ruochen Xu, Chenguang Zhu, Yu Shi, Michael Zeng, Xuedong Huang
Abstract
Cross-lingual Summarization (CLS) aims at producing a summary in the target language for an article in the source language. Traditional solutions employ a two-step approach, i.e. translate -> summarize or summarize -> translate. Recently, end-to-end models have achieved better results, but these approaches are mostly limited by their dependence on large-scale labeled data. We propose a solution based on mixed-lingual pre-training that leverages both cross-lingual tasks such as translation and monolingual tasks like masked language modeling. Thus, our model can leverage massive monolingual data to enhance its modeling of language. Moreover, the architecture has no task-specific components, which saves memory and increases optimization efficiency. We show in experiments that this pre-training scheme can effectively boost the performance of cross-lingual summarization. On the NCLS dataset, our model achieves an improvement of 2.82 (English to Chinese) and 1.15 (Chinese to English) ROUGE-1 points over state-of-the-art results.
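The sketch below illustrates the general idea described in the abstract, not the authors' released code: a single shared encoder-decoder is pre-trained on a mixture of monolingual denoising (masked-token reconstruction) batches and cross-lingual translation batches, with one shared vocabulary and no task-specific heads. All names (`SharedSeq2Seq`, `mask_tokens`), sizes, and the random placeholder data are illustrative assumptions.

```python
# Minimal sketch of mixed-lingual pre-training (assumed, toy setup):
# one shared seq2seq model alternates between monolingual masked-token
# reconstruction and parallel translation batches.
import random
import torch
import torch.nn as nn

VOCAB, PAD, MASK = 1000, 0, 1      # toy shared vocabulary; 0 = pad, 1 = [MASK]
D_MODEL = 64

class SharedSeq2Seq(nn.Module):
    """One embedding table, Transformer, and LM head shared by every task."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D_MODEL, padding_idx=PAD)
        self.transformer = nn.Transformer(
            d_model=D_MODEL, nhead=4, num_encoder_layers=2,
            num_decoder_layers=2, batch_first=True)
        self.lm_head = nn.Linear(D_MODEL, VOCAB)

    def forward(self, src, tgt):
        h = self.transformer(self.embed(src), self.embed(tgt))
        return self.lm_head(h)

def mask_tokens(seq, p=0.15):
    """Monolingual task: corrupt ~p of the tokens; the target is the original text."""
    corrupted = seq.clone()
    noise = torch.rand(seq.shape) < p
    corrupted[noise] = MASK
    return corrupted, seq

def random_batch(n=8, length=20):
    """Placeholder for real tokenized text / parallel sentence pairs."""
    return torch.randint(2, VOCAB, (n, length))

model = SharedSeq2Seq()
optim = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)

for step in range(10):                             # toy pre-training loop
    if random.random() < 0.5:                      # monolingual denoising batch
        src, tgt = mask_tokens(random_batch())
    else:                                          # cross-lingual translation batch
        src, tgt = random_batch(), random_batch()  # stands in for (source, reference)
    logits = model(src, tgt)                       # teacher forcing; target shift omitted for brevity
    loss = loss_fn(logits.reshape(-1, VOCAB), tgt.reshape(-1))
    optim.zero_grad(); loss.backward(); optim.step()
```

After this mixed-lingual pre-training stage, the same shared model would be fine-tuned directly on cross-lingual summarization pairs; because no component is tied to a single task, no parameters are added or discarded between stages.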
- Anthology ID:
- 2020.aacl-main.53
- Volume:
- Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing
- Month:
- December
- Year:
- 2020
- Address:
- Suzhou, China
- Editors:
- Kam-Fai Wong, Kevin Knight, Hua Wu
- Venue:
- AACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 536–541
- URL:
- https://aclanthology.org/2020.aacl-main.53
- Cite (ACL):
- Ruochen Xu, Chenguang Zhu, Yu Shi, Michael Zeng, and Xuedong Huang. 2020. Mixed-Lingual Pre-training for Cross-lingual Summarization. In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, pages 536–541, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal):
- Mixed-Lingual Pre-training for Cross-lingual Summarization (Xu et al., AACL 2020)
- PDF:
- https://aclanthology.org/2020.aacl-main.53.pdf
- Data
- NCLS