A Simple yet Effective Joint Training Method for Cross-Lingual Universal Dependency Parsing

Danlu Chen, Mengxiao Lin, Zhifeng Hu, Xipeng Qiu


Abstract
This paper describes Fudan’s submission to the CoNLL 2018 shared task on Universal Dependency Parsing. We jointly train models when two languages are similar according to linguistic typology, and then ensemble the models using a simple re-parse algorithm. We outperform the baseline method by 4.4% (2.1%) on average on the development (test) set of the CoNLL 2018 UD Shared Task.
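
The “simple re-parse algorithm” mentioned above can be pictured as arc-level voting followed by re-decoding a single well-formed tree, in the spirit of re-parsing ensembles (Sagae and Lavie, 2006). The Python sketch below illustrates that idea under the assumption that each parser outputs one head per token; the name reparse_ensemble and the use of networkx’s Chu-Liu-Edmonds decoder are illustrative choices, not taken from the paper’s code.

    from collections import Counter
    import networkx as nx

    def reparse_ensemble(predicted_heads, n_tokens):
        """Combine several parsers' trees by arc voting, then re-parse.

        predicted_heads: one head sequence per parser; entry i is the head
        of token i+1, with 0 denoting the artificial ROOT.
        Returns a single head sequence for the ensembled tree.
        """
        # Each parser casts one vote per (head, dependent) arc it predicted.
        votes = Counter()
        for heads in predicted_heads:
            for dep, head in enumerate(heads, start=1):
                votes[(head, dep)] += 1

        # Build a directed graph over ROOT (node 0) and the n tokens,
        # weighting every voted arc by its vote count.
        g = nx.DiGraph()
        g.add_nodes_from(range(n_tokens + 1))
        for (head, dep), weight in votes.items():
            g.add_edge(head, dep, weight=weight)

        # Re-decode: the highest-weight spanning arborescence (Chu-Liu-Edmonds).
        # No arc enters node 0, so the result is necessarily rooted at ROOT.
        tree = nx.maximum_spanning_arborescence(g, attr="weight")
        heads = [0] * n_tokens
        for head, dep in tree.edges():
            heads[dep - 1] = head
        return heads

    # Three parsers disagree on token 1's head; the ensemble follows the
    # majority while still returning a valid tree: [2, 0, 2].
    print(reparse_ensemble([[2, 0, 2], [2, 0, 2], [3, 0, 2]], 3))

In the paper’s actual system the combined models are the jointly trained cross-lingual parsers, and the arc weights could come from model scores rather than raw votes; the sketch only shows the re-parse mechanics.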
Anthology ID: K18-2026
Volume: Proceedings of the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies
Month: October
Year: 2018
Address: Brussels, Belgium
Editors: Daniel Zeman, Jan Hajič
Venue: CoNLL
SIG: SIGNLL
Publisher: Association for Computational Linguistics
Pages: 256–263
URL: https://aclanthology.org/K18-2026
DOI: 10.18653/v1/K18-2026
Cite (ACL): Danlu Chen, Mengxiao Lin, Zhifeng Hu, and Xipeng Qiu. 2018. A Simple yet Effective Joint Training Method for Cross-Lingual Universal Dependency Parsing. In Proceedings of the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, pages 256–263, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal): A Simple yet Effective Joint Training Method for Cross-Lingual Universal Dependency Parsing (Chen et al., CoNLL 2018)
PDF: https://preview.aclanthology.org/improve-issue-templates/K18-2026.pdf