Cross-Lingual Transfer with Language-Specific Subnetworks for Low-Resource Dependency Parsing

Rochelle Choenni, Dan Garrette, Ekaterina Shutova


Abstract
Large multilingual language models typically share their parameters across all languages, which enables cross-lingual task transfer but can also hinder learning when training updates from different languages conflict. In this article, we propose novel methods for using language-specific subnetworks, which control cross-lingual parameter sharing, to reduce conflicts and increase positive transfer during fine-tuning. We introduce dynamic subnetworks, which are jointly updated with the model, and we combine our methods with meta-learning, an established but complementary technique for improving cross-lingual transfer. Finally, we provide extensive analyses of how each of our methods affects the models.
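
The abstract describes the subnetwork mechanism only at a high level. As a rough, hypothetical illustration of the general idea, the PyTorch sketch below gates a layer's weights with a fixed language-specific binary mask, so a batch from a given language updates only that language's subnetwork. The class name MaskedLinear and the random mask initialization are assumptions made purely for illustration, not the paper's implementation; in particular, the paper's dynamic subnetworks are updated jointly with the model rather than held fixed.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedLinear(nn.Module):
    """Linear layer whose weights are gated by a per-language binary mask.

    Hypothetical sketch: real masks would be derived (e.g., by pruning)
    and registered as buffers, not sampled at random.
    """

    def __init__(self, in_dim: int, out_dim: int, languages: list[str]):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        # One fixed binary mask per language over the weight matrix.
        self.masks = {
            lang: (torch.rand_like(self.linear.weight) > 0.5).float()
            for lang in languages
        }

    def forward(self, x: torch.Tensor, lang: str) -> torch.Tensor:
        # Multiplying by the mask inside the forward pass means that
        # backpropagation automatically zeroes gradients for parameters
        # outside this language's subnetwork.
        masked_weight = self.linear.weight * self.masks[lang]
        return F.linear(x, masked_weight, self.linear.bias)

layer = MaskedLinear(16, 8, languages=["en", "fi"])
x = torch.randn(4, 16)
layer(x, lang="fi").sum().backward()
# Gradients vanish exactly where the "fi" mask is zero:
print(layer.linear.weight.grad[layer.masks["fi"] == 0].abs().sum())  # tensor(0.)

Because the mask multiplies the weights inside the forward pass, parameters outside a language's subnetwork receive zero gradient for that language's batches, which is the sense in which subnetworks "control cross-lingual parameter sharing."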
Anthology ID: 2023.cl-3.3
Volume: Computational Linguistics, Volume 49, Issue 3 - September 2023
Month: September
Year: 2023
Address: Cambridge, MA
Venue: CL
Publisher: MIT Press
Pages: 613–641
URL: https://aclanthology.org/2023.cl-3.3
DOI: 10.1162/coli_a_00482
Cite (ACL): Rochelle Choenni, Dan Garrette, and Ekaterina Shutova. 2023. Cross-Lingual Transfer with Language-Specific Subnetworks for Low-Resource Dependency Parsing. Computational Linguistics, 49(3):613–641.
Cite (Informal): Cross-Lingual Transfer with Language-Specific Subnetworks for Low-Resource Dependency Parsing (Choenni et al., CL 2023)
PDF: https://aclanthology.org/2023.cl-3.3.pdf
Video: https://aclanthology.org/2023.cl-3.3.mp4