Abstract
Linear embedding transformation has been shown to be effective for zero-shot cross-lingual transfer and achieves surprisingly promising results. However, cross-lingual embedding space mapping is usually studied with static word-level embeddings, where the transformation is derived by aligning representations of translation pairs drawn from dictionaries. We move beyond this line of work and investigate a contextual embedding alignment approach that is sense-level and dictionary-free. To improve the quality of the mapping, we also take a close look at properties of contextual embeddings, namely the anisotropy problem and its solution. In experiments on zero-shot dependency parsing, the concept-shared space built by our embedding transformation substantially outperforms state-of-the-art methods that use multilingual embeddings.
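The core operation the abstract describes, mapping one embedding space into another with a linear transformation, can be illustrated with a short sketch. The snippet below is a minimal, generic example rather than the paper's dictionary-free, sense-level method: it assumes paired source/target embedding matrices `X` and `Y` are already available, mean-centers each space (a common, simple mitigation for the anisotropy of contextual embeddings), and solves an orthogonal Procrustes problem for the linear map.

```python
# Illustrative sketch only (not the paper's method): align two embedding
# spaces with an orthogonal linear map after mean-centering each space.
# Mean-centering is a common remedy for the anisotropy of contextual
# embeddings; the paired matrices X and Y are assumed to be given,
# whereas the paper derives its alignment without a dictionary.
import numpy as np


def mean_center(E: np.ndarray) -> np.ndarray:
    """Subtract the mean vector so the embedding cloud is centered at the origin."""
    return E - E.mean(axis=0, keepdims=True)


def procrustes_map(X: np.ndarray, Y: np.ndarray) -> np.ndarray:
    """Solve min_W ||XW - Y||_F over orthogonal W via the SVD of X^T Y."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt


# Toy usage with random stand-ins for contextual embeddings (n tokens, d dims).
rng = np.random.default_rng(0)
X = mean_center(rng.normal(size=(500, 768)))  # source-language vectors
Y = mean_center(rng.normal(size=(500, 768)))  # target-language vectors
W = procrustes_map(X, Y)
X_mapped = X @ W  # source vectors expressed in the target space
```

Once source-language embeddings are mapped into such a shared space, a parser trained on the target language can, in principle, be applied to them directly, which is the zero-shot transfer setting the paper evaluates.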
- Anthology ID: 2021.adaptnlp-1.21
- Volume: Proceedings of the Second Workshop on Domain Adaptation for NLP
- Month: April
- Year: 2021
- Address: Kyiv, Ukraine
- Venue: AdaptNLP
- Publisher: Association for Computational Linguistics
- Pages: 204–213
- URL: https://aclanthology.org/2021.adaptnlp-1.21
- Cite (ACL): Haoran Xu and Philipp Koehn. 2021. Zero-Shot Cross-Lingual Dependency Parsing through Contextual Embedding Transformation. In Proceedings of the Second Workshop on Domain Adaptation for NLP, pages 204–213, Kyiv, Ukraine. Association for Computational Linguistics.
- Cite (Informal): Zero-Shot Cross-Lingual Dependency Parsing through Contextual Embedding Transformation (Xu & Koehn, AdaptNLP 2021)
- PDF: https://preview.aclanthology.org/ingestion-script-update/2021.adaptnlp-1.21.pdf
- Code: fe1ixxu/ZeroShot-CrossLing-Parsing