Data Augmentation for Cross-Domain Named Entity Recognition
Shuguang Chen, Gustavo Aguilar, Leonardo Neves, Thamar Solorio
Abstract
Current work in named entity recognition (NER) shows that data augmentation techniques can produce more robust models. However, most existing techniques focus on augmenting in-domain data in low-resource scenarios where annotated data is quite limited. In this work, we take the opposite direction and study cross-domain data augmentation for the NER task. We investigate the possibility of leveraging data from high-resource domains by projecting it into low-resource domains. Specifically, we propose a novel neural architecture to transform the data representation from a high-resource to a low-resource domain by learning the patterns (e.g., style, noise, abbreviations) in the text that differentiate them, along with a shared feature space where both domains are aligned. We experiment with diverse datasets and show that transforming the data to the low-resource domain representation achieves significant improvements over only using data from high-resource domains.
- Anthology ID: 2021.emnlp-main.434
- Volume: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
- Month: November
- Year: 2021
- Address: Online and Punta Cana, Dominican Republic
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 5346–5356
- URL: https://aclanthology.org/2021.emnlp-main.434
- DOI: 10.18653/v1/2021.emnlp-main.434
- Cite (ACL): Shuguang Chen, Gustavo Aguilar, Leonardo Neves, and Thamar Solorio. 2021. Data Augmentation for Cross-Domain Named Entity Recognition. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 5346–5356, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal): Data Augmentation for Cross-Domain Named Entity Recognition (Chen et al., EMNLP 2021)
- PDF: https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.434.pdf
- Code: ritual-uh/style_ner
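
The repository above contains the official implementation. As a rough illustration only, the sketch below shows one generic way to realize the idea described in the abstract: a shared encoder whose features are aligned across domains by a discriminator, plus a decoder conditioned on a domain code so that high-resource sentences can be decoded in the low-resource domain's style. The architecture, hyperparameters, and toy data here are assumptions for illustration and do not reproduce the model in ritual-uh/style_ner.

```python
# Minimal sketch (PyTorch) of a generic cross-domain projection model, NOT the
# authors' architecture: a shared encoder, a domain-code-conditioned decoder,
# and a discriminator that encourages the shared features to be domain-aligned.
import torch
import torch.nn as nn

VOCAB, EMB, HID, N_DOMAINS = 1000, 64, 128, 2  # illustrative sizes


class CrossDomainAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.encoder = nn.GRU(EMB, HID, batch_first=True)      # shared feature space
        self.domain_embed = nn.Embedding(N_DOMAINS, HID)       # per-domain "style" code
        self.decoder = nn.GRU(EMB, HID, batch_first=True)      # domain-conditioned decoder
        self.out = nn.Linear(HID, VOCAB)                        # token logits
        self.discriminator = nn.Linear(HID, N_DOMAINS)          # predicts the source domain

    def forward(self, tokens, target_domain):
        emb = self.embed(tokens)
        _, h = self.encoder(emb)                                # h: (1, batch, HID)
        domain_logits = self.discriminator(h.squeeze(0))        # used for the alignment loss
        h = h + self.domain_embed(target_domain).unsqueeze(0)   # inject target-domain style
        dec_out, _ = self.decoder(emb, h)
        return self.out(dec_out), domain_logits


# Toy usage: train a reconstruction step on the high-resource domain (id 0),
# then decode the same content conditioned on the low-resource domain (id 1).
model = CrossDomainAutoencoder()
tokens = torch.randint(0, VOCAB, (4, 12))                       # fake batch of token ids
src = torch.zeros(4, dtype=torch.long)                          # high-resource domain labels

logits, domain_logits = model(tokens, src)                      # reconstruct in source domain
ce = nn.CrossEntropyLoss()
recon_loss = ce(logits.reshape(-1, VOCAB), tokens.reshape(-1))
# Alignment term: push shared features toward domain-indistinguishability
# (naively, by negating the discriminator loss; gradient reversal or
# alternating updates would be used in practice).
adv_loss = -ce(domain_logits, src)
(recon_loss + 0.1 * adv_loss).backward()

# Projection step: decode the same sentences with the low-resource domain code,
# yielding augmented data in the low-resource domain's representation.
tgt = torch.ones(4, dtype=torch.long)
with torch.no_grad():
    projected_logits, _ = model(tokens, tgt)
```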