ConNER: Consistency Training for Cross-lingual Named Entity Recognition

Ran Zhou, Xin Li, Lidong Bing, Erik Cambria, Luo Si, Chunyan Miao


Abstract
Cross-lingual named entity recognition (NER) suffers from data scarcity in the target languages, especially under zero-shot settings. Existing translate-train or knowledge distillation methods attempt to bridge the language gap, but often introduce a high level of noise. To solve this problem, consistency training methods regularize the model to be robust to perturbations on data or hidden states. However, such methods are likely to violate the consistency hypothesis, or they mainly focus on coarse-grained consistency. We propose ConNER as a novel consistency training framework for cross-lingual NER, which comprises: (1) translation-based consistency training on unlabeled target-language data, and (2) dropout-based consistency training on labeled source-language data. ConNER effectively leverages unlabeled target-language data and alleviates overfitting on the source language to enhance cross-lingual adaptability. Experimental results show that ConNER achieves consistent improvements over various baseline methods.
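The dropout-based component described above can be illustrated with a minimal sketch: run the same input through a stochastic tagger twice, obtaining two label distributions from two different dropout masks, and penalize their divergence with a symmetric KL loss. This is a hypothetical NumPy toy model, not the paper's implementation; the linear tagger, dropout rate, and dimensions are assumptions chosen only to make the consistency loss concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # KL(p || q) per token, summed over labels, averaged over tokens
    return np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1))

def forward_with_dropout(x, w, drop_rate=0.1):
    # hypothetical linear tagger; inverted dropout on the input features
    mask = rng.random(x.shape) >= drop_rate
    h = x * mask / (1.0 - drop_rate)
    return softmax(h @ w)  # token-level label distributions

# toy batch: 4 tokens, 8-dim features, 5 NER labels (illustrative sizes)
x = rng.normal(size=(4, 8))
w = rng.normal(size=(8, 5))

p1 = forward_with_dropout(x, w)  # first stochastic sub-model
p2 = forward_with_dropout(x, w)  # second stochastic sub-model

# symmetric KL consistency loss between the two dropout passes
consistency_loss = 0.5 * (kl_div(p1, p2) + kl_div(p2, p1))
```

In training, this loss would be added to the usual supervised tagging loss so that the model's predictions stay stable under dropout perturbation, discouraging overfitting to the source language.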
Anthology ID:
2022.emnlp-main.577
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8438–8449
URL:
https://aclanthology.org/2022.emnlp-main.577
DOI:
10.18653/v1/2022.emnlp-main.577
Cite (ACL):
Ran Zhou, Xin Li, Lidong Bing, Erik Cambria, Luo Si, and Chunyan Miao. 2022. ConNER: Consistency Training for Cross-lingual Named Entity Recognition. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 8438–8449, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
ConNER: Consistency Training for Cross-lingual Named Entity Recognition (Zhou et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.577.pdf