Language Anisotropic Cross-Lingual Model Editing

Yang Xu, Yutai Hou, Wanxiang Che, Min Zhang


Abstract
Multilingual pre-trained language models can learn task-specific abilities or memorize facts across multiple languages, but they inevitably make undesired predictions on specific inputs. Motivated by this observation, model editing aims to calibrate a model post hoc on targeted inputs while preserving its original behavior elsewhere. However, existing work studies only the monolingual scenario and lacks the cross-lingual transferability needed to perform editing across languages simultaneously. In this work, we focus on cross-lingual model editing. First, we define the cross-lingual model editing task and corresponding metrics, where an edit in one language should propagate to the others. Next, we propose a framework that naturally adapts monolingual model editing approaches to the cross-lingual scenario using a parallel corpus. Further, we propose language anisotropic editing, which improves cross-lingual editing by amplifying a different subset of parameters for each language. On the newly defined cross-lingual model editing task, we empirically demonstrate that monolingual baselines fail to propagate an edit to multiple languages and that the proposed language anisotropic model editing is effective. Our code is publicly available at https://github.com/franklear/LiME.
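To make the parameter-amplification idea concrete, below is a minimal PyTorch sketch of per-language masking of an editor's parameter update. It assumes a base monolingual editor (e.g., a hypernetwork-style editor) has already produced a dense update for each edited weight matrix; the class name `LanguageAnisotropicMask`, the parameter names, and the shapes are illustrative assumptions, not the paper's actual implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn


class LanguageAnisotropicMask(nn.Module):
    """Per-language soft masks over parameter updates (illustrative sketch).

    One learnable tensor per edited weight matrix and per language; a sigmoid
    turns it into a gate in (0, 1) that amplifies or damps individual entries
    of the editor's update, so each language can favor its own parameter subset.
    """

    def __init__(self, num_languages: int, param_shapes: dict):
        super().__init__()
        # Note: nn.ParameterDict keys may not contain '.', so real module paths
        # like "encoder.layer.0.output.dense.weight" should be sanitized first.
        self.masks = nn.ParameterDict({
            name: nn.Parameter(torch.zeros(num_languages, *shape))
            for name, shape in param_shapes.items()
        })

    def forward(self, updates: dict, lang_id: int) -> dict:
        # sigmoid(0) = 0.5, so every language starts from a near-uniform mask.
        return {
            name: torch.sigmoid(self.masks[name][lang_id]) * delta
            for name, delta in updates.items()
        }


# Hypothetical usage: mask the dense updates produced by a monolingual editor.
shapes = {"mlp_fc_weight": torch.Size([3072, 768]),
          "mlp_proj_weight": torch.Size([768, 3072])}
masker = LanguageAnisotropicMask(num_languages=4, param_shapes=shapes)
raw_updates = {name: torch.randn(*shape) for name, shape in shapes.items()}
masked_updates = masker(raw_updates, lang_id=2)  # edit expressed in language 2
```

In a full system, masks of this kind would be trained jointly with the editor on parallel edit data so that each language's gate learns which parameters matter for it; the snippet only shows how such a mask would be applied at edit time.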
Anthology ID:
2023.findings-acl.343
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5554–5569
URL:
https://aclanthology.org/2023.findings-acl.343
DOI:
10.18653/v1/2023.findings-acl.343
Cite (ACL):
Yang Xu, Yutai Hou, Wanxiang Che, and Min Zhang. 2023. Language Anisotropic Cross-Lingual Model Editing. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5554–5569, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Language Anisotropic Cross-Lingual Model Editing (Xu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.343.pdf