Cross-Domain NER using Cross-Domain Language Modeling

Chen Jia, Xiaobo Liang, Yue Zhang


Abstract
Due to limited labeled resources, cross-domain named entity recognition (NER) remains a challenging task. Most existing work assumes a supervised setting, making use of labeled NER data for both the source and target domains. A disadvantage of such methods is that they cannot be trained for domains without NER annotations. To address this issue, we use cross-domain language modeling (LM) as a bridge across domains for NER domain adaptation, performing cross-domain and cross-task knowledge transfer through a novel parameter generation network. Results show that our method can effectively extract domain differences from cross-domain LM contrast, allowing unsupervised domain adaptation while also giving state-of-the-art results among supervised domain adaptation methods.
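
The abstract's core mechanism is the parameter generation network: a shared meta-parameter tensor is contracted with a domain embedding and a task embedding to produce the parameters used for a particular (domain, task) combination, so NER and language modeling on the source and target domains share one encoder while differing only in the generated parameters. Below is a minimal PyTorch sketch of this idea, not the authors' implementation: the paper generates Bi-LSTM parameters, whereas this sketch generates only a linear output projection, and all module names, dimensions, and the two-domain/two-task setup are illustrative assumptions.

import torch
import torch.nn as nn

class ParamGenerator(nn.Module):
    """Produce an (out_dim x in_dim) weight matrix from a domain embedding and a
    task embedding by contracting them with a shared 4th-order meta-parameter tensor."""
    def __init__(self, in_dim, out_dim, domain_dim, task_dim):
        super().__init__()
        self.meta = nn.Parameter(torch.randn(out_dim, in_dim, domain_dim, task_dim) * 0.01)

    def forward(self, domain_emb, task_emb):
        # (out, in, d, t) contracted with (d,) and (t,) -> (out, in)
        return torch.einsum('oidt,d,t->oi', self.meta, domain_emb, task_emb)

class CrossDomainTagger(nn.Module):
    """Toy cross-domain/cross-task model: one shared BiLSTM encoder; the output
    projection is generated per (domain, task) pair rather than stored directly."""
    def __init__(self, vocab_size, emb_dim, hid_dim, n_labels, domain_dim=8, task_dim=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.domain_emb = nn.Embedding(2, domain_dim)  # 0 = source domain, 1 = target domain
        self.task_emb = nn.Embedding(2, task_dim)      # 0 = NER, 1 = language modeling
        self.generator = ParamGenerator(2 * hid_dim, n_labels, domain_dim, task_dim)

    def forward(self, tokens, domain_id, task_id):
        h, _ = self.encoder(self.embed(tokens))                      # (B, T, 2H)
        w = self.generator(self.domain_emb(domain_id),
                           self.task_emb(task_id))                   # (n_labels, 2H)
        return h @ w.t()                                             # (B, T, n_labels)

# Example: scores for NER on the source domain come from the same encoder that
# would serve LM on the target domain; only the generated projection differs.
model = CrossDomainTagger(vocab_size=1000, emb_dim=32, hid_dim=64, n_labels=9)
tokens = torch.randint(0, 1000, (4, 12))
ner_scores = model(tokens, torch.tensor(0), torch.tensor(0))
print(ner_scores.shape)  # torch.Size([4, 12, 9])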
Anthology ID:
P19-1236
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2464–2474
URL:
https://aclanthology.org/P19-1236
DOI:
10.18653/v1/P19-1236
Cite (ACL):
Chen Jia, Xiaobo Liang, and Yue Zhang. 2019. Cross-Domain NER using Cross-Domain Language Modeling. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 2464–2474, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Cross-Domain NER using Cross-Domain Language Modeling (Jia et al., ACL 2019)
PDF:
https://preview.aclanthology.org/ingest-bitext-workshop/P19-1236.pdf
Code
 jiachenwestlake/Cross-Domain_NER
Data
CoNLL 2003
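
CoNLL 2003 is the source-domain NER dataset used in the paper. For reference, it is distributed in a plain-text column format: one token per line with its POS tag, chunk tag, and BIO-style NER tag, blank lines separating sentences, and -DOCSTART- lines marking document boundaries. A minimal reader sketch follows (the function name and file path are illustrative, not part of the released code):

def read_conll2003(path):
    """Yield (tokens, ner_tags) sentence pairs from a CoNLL 2003 file.
    Each non-blank line has the form: TOKEN POS CHUNK NER-TAG."""
    tokens, tags = [], []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("-DOCSTART-"):
                if tokens:
                    yield tokens, tags
                    tokens, tags = [], []
                continue
            fields = line.split()
            tokens.append(fields[0])
            tags.append(fields[-1])
    if tokens:
        yield tokens, tags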