Abstract
Finetuning pretrained models on downstream generation tasks often leads to catastrophic forgetting in zero-shot conditions. In this work, we focus on summarization and tackle the problem through the lens of language-independent representations. After training on monolingual summarization, we perform zero-shot transfer to new languages or language pairs. We first show that naively finetuned models are highly language-specific in both output behavior and internal representations, resulting in poor zero-shot performance. Next, we propose query-key (QK) finetuning to decouple task-specific knowledge from the pretrained language generation abilities. Then, after showing the downsides of the standard adversarial language classifier, we propose a balanced variant that more directly enforces language-agnostic representations. Moreover, our qualitative analyses show that removing source language identity correlates with zero-shot summarization performance. Our code is openly available.
- Anthology ID:
- 2024.naacl-short.68
- Volume:
- Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers)
- Month:
- June
- Year:
- 2024
- Address:
- Mexico City, Mexico
- Editors:
- Kevin Duh, Helena Gomez, Steven Bethard
- Venue:
- NAACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 772–782
- URL:
- https://aclanthology.org/2024.naacl-short.68
- DOI:
- 10.18653/v1/2024.naacl-short.68
- Cite (ACL):
- Vladimir Solovyev, Danni Liu, and Jan Niehues. 2024. Language-Independent Representations Improve Zero-Shot Summarization. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers), pages 772–782, Mexico City, Mexico. Association for Computational Linguistics.
- Cite (Informal):
- Language-Independent Representations Improve Zero-Shot Summarization (Solovyev et al., NAACL 2024)
- PDF:
- https://aclanthology.org/2024.naacl-short.68.pdf
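
The abstract names two techniques without spelling them out; below are minimal, hedged sketches of each. The model checkpoint, module names, and pooling choice are illustrative assumptions, not the authors' released code.

Query-key (QK) finetuning, read literally, updates only the query and key projections in the attention layers while freezing the rest of the pretrained model, so the task is learned without overwriting the pretrained generation abilities. A sketch against a Hugging Face mBART checkpoint (the `q_proj`/`k_proj` names are mBART's; other checkpoints differ):

```python
from transformers import AutoModelForSeq2SeqLM

# Hypothetical checkpoint choice: any multilingual seq2seq model whose
# attention layers expose q_proj/k_proj submodules works the same way.
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/mbart-large-50")

for name, param in model.named_parameters():
    # Train only the attention query/key projections; freeze everything else.
    param.requires_grad = name.endswith(
        ("q_proj.weight", "q_proj.bias", "k_proj.weight", "k_proj.bias")
    )

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"Trainable: {trainable:,} / {total:,} parameters")
```

For the balanced adversarial language classifier, one plausible reading of "more directly enforces language-agnostic representations" is to train the encoder so the classifier's output is uniform over languages, rather than merely reversing the classifier's gradient (which only rewards the classifier being wrong, possibly by overshooting toward a different specific language). The function below implements that reading, not a confirmed reproduction of the paper's loss:

```python
import torch.nn.functional as F

def balanced_adversarial_loss(classifier, encoder_states):
    """Encoder-side objective: push a (separately trained, frozen here)
    language classifier toward a uniform distribution over languages.
    `classifier` maps a pooled encoder state to language logits."""
    pooled = encoder_states.mean(dim=1)                # mean-pool over time
    log_probs = F.log_softmax(classifier(pooled), dim=-1)
    # Cross-entropy to the uniform target: -(1/K) * sum_k log p_k, per example.
    return -log_probs.mean(dim=-1).mean()
```

In the usual adversarial setup this encoder objective would alternate with a standard cross-entropy language-identification step that updates only the classifier; that training schedule is likewise an assumption here.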