Constructing Code-mixed Universal Dependency Forest for Unbiased Cross-lingual Relation Extraction

Hao Fei, Meishan Zhang, Min Zhang, Tat-Seng Chua


Abstract
Latest efforts on cross-lingual relation extraction (XRE) aggressively leverage the language-consistent structural features from the universal dependency (UD) resource, while they may largely suffer from biased transfer (e.g., either target-biased or source-biased) due to the inevitable linguistic disparity between languages. In this work, we investigate unbiased UD-based XRE transfer by constructing a type of code-mixed UD forest. We first translate the source-language sentence into its parallel target-language counterpart and parse a UD tree for each. Then, we merge the source- and target-side UD structures into a unified code-mixed UD forest. With such forest features, the gaps of UD-based XRE between the training and predicting phases can be effectively closed. We conduct experiments on the ACE XRE benchmark datasets, where the results demonstrate that the proposed code-mixed UD forests help unbiased UD-based XRE transfer, with which we achieve significant XRE performance gains.
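
The following is a minimal sketch of the pipeline the abstract describes (translate the source sentence, parse both sides with a UD parser, merge the two trees into one code-mixed forest). The translate_to_target and align components, the language pair, and the edge-merging scheme are illustrative assumptions rather than the authors' implementation; only the stanza parser calls reflect a real library.

    # Hypothetical sketch of code-mixed UD forest construction (not the paper's code).
    import stanza

    # UD parsers for the source (e.g., English) and target (e.g., Chinese) languages.
    # Run stanza.download("en") and stanza.download("zh") once beforehand.
    src_nlp = stanza.Pipeline("en", processors="tokenize,pos,lemma,depparse")
    tgt_nlp = stanza.Pipeline("zh", processors="tokenize,pos,lemma,depparse")

    def ud_edges(doc, side):
        """Collect (head, dependent, relation) edges from the first parsed sentence,
        prefixing tokens with their language side so the merged forest stays code-mixed."""
        sent = doc.sentences[0]
        edges = []
        for word in sent.words:
            head = sent.words[word.head - 1].text if word.head > 0 else "ROOT"
            edges.append((f"{side}:{head}", f"{side}:{word.text}", word.deprel))
        return edges

    def code_mixed_forest(src_sentence, translate_to_target, align):
        """Build a unified forest from the source UD tree and the UD tree of its
        translation. translate_to_target and align (word alignment) are assumed
        external components supplied by the caller."""
        tgt_sentence = translate_to_target(src_sentence)
        src_edges = ud_edges(src_nlp(src_sentence), "src")
        tgt_edges = ud_edges(tgt_nlp(tgt_sentence), "tgt")
        # Cross-lingual links between aligned source/target tokens tie the two
        # trees together into a single forest structure.
        cross_links = [(f"src:{s}", f"tgt:{t}", "align")
                       for s, t in align(src_sentence, tgt_sentence)]
        return src_edges + tgt_edges + cross_links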
Anthology ID:
2023.findings-acl.599
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
9395–9408
URL:
https://aclanthology.org/2023.findings-acl.599
DOI:
10.18653/v1/2023.findings-acl.599
Cite (ACL):
Hao Fei, Meishan Zhang, Min Zhang, and Tat-Seng Chua. 2023. Constructing Code-mixed Universal Dependency Forest for Unbiased Cross-lingual Relation Extraction. In Findings of the Association for Computational Linguistics: ACL 2023, pages 9395–9408, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Constructing Code-mixed Universal Dependency Forest for Unbiased Cross-lingual Relation Extraction (Fei et al., Findings 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-acl.599.pdf