Abstract
Domain adaptation is an important and widely studied problem in natural language processing. A large body of literature tries to solve this problem by adapting models trained on the source domain to the target domain. In this paper, we instead solve this problem from a dataset perspective. We modify the source domain dataset with simple lexical transformations to reduce the domain shift between the source dataset distribution and the target dataset distribution. We find that models trained on the transformed source domain dataset perform significantly better than zero-shot models. Using our proposed transformations to convert standard English to tweets, we reach an unsupervised part-of-speech (POS) tagging accuracy of 92.14% (from 81.54% zero-shot accuracy), which is only slightly below the supervised performance of 94.45%. We also use our proposed transformations to synthetically generate tweets and augment the Twitter dataset to achieve state-of-the-art performance for POS tagging.
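To make the idea concrete, below is a minimal sketch, not taken from the paper, of the kind of token-level lexical transformation the abstract describes: rewriting standard-English tokens into Twitter-style forms while keeping a one-to-one token alignment, so existing POS labels can be carried over to the transformed text. The lexicon entries and the `to_tweet_style` helper are illustrative assumptions, not the paper's actual transformation rules.

```python
# Hypothetical illustration of a token-level lexical transformation.
# The mapping below is an assumed example, not the paper's rule set.
TWITTER_LEXICON = {
    "you": "u",
    "are": "r",
    "be": "b",
    "tonight": "2nite",
    "great": "gr8",
}

def to_tweet_style(tokens):
    """Rewrite standard-English tokens into Twitter-style forms,
    preserving one-to-one token alignment so POS tags carry over."""
    return [TWITTER_LEXICON.get(tok.lower(), tok) for tok in tokens]

tokens = "Are you coming tonight".split()
print(to_tweet_style(tokens))  # ['r', 'u', 'coming', '2nite']
```

Because each source token maps to exactly one transformed token, a POS-tagged source sentence yields a synthetic tweet-style sentence with the same tag sequence, which is how a tagger can then be trained on the transformed data.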
- Anthology ID: 2023.wassa-1.17
- Volume: Proceedings of the 13th Workshop on Computational Approaches to Subjectivity, Sentiment, & Social Media Analysis
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Venue: WASSA
- Publisher: Association for Computational Linguistics
- Pages: 184–193
- URL: https://aclanthology.org/2023.wassa-1.17
- Cite (ACL): Akshat Gupta, Xiaomo Liu, and Sameena Shah. 2023. Unsupervised Domain Adaptation using Lexical Transformations and Label Injection for Twitter Data. In Proceedings of the 13th Workshop on Computational Approaches to Subjectivity, Sentiment, & Social Media Analysis, pages 184–193, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Unsupervised Domain Adaptation using Lexical Transformations and Label Injection for Twitter Data (Gupta et al., WASSA 2023)
- PDF: https://preview.aclanthology.org/paclic-22-ingestion/2023.wassa-1.17.pdf