Abstract
This paper proposes a way to improve the performance of existing algorithms for text classification in domains with strong language semantics. A proposed domain adaptation layer learns weights to combine a generic and a domain-specific (DS) word embedding into a domain-adapted (DA) embedding. The DA word embeddings are then used as inputs to a generic encoder + classifier framework to perform a downstream task such as classification. This adaptation layer is particularly suited to datasets of modest size, which are therefore not ideal candidates for (re)training a deep neural network architecture. Results on binary and multi-class classification tasks using popular encoder architectures, including current state-of-the-art methods (with and without the shallow adaptation layer), show the effectiveness of the proposed approach.
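As a rough illustration of the idea described in the abstract, the sketch below shows one way a shallow layer could learn weights to combine a generic and a DS embedding into a DA embedding before a downstream encoder. It is a minimal PyTorch sketch under assumed details (the module name `ShallowAdaptationLayer`, the projection into a common space, and the softmax mixing weights are illustrative choices), not the authors' implementation.

```python
import torch
import torch.nn as nn

class ShallowAdaptationLayer(nn.Module):
    """Hypothetical sketch: combine a generic and a domain-specific (DS)
    token embedding into a domain-adapted (DA) embedding via learned
    mixing weights. Illustrative only; the paper's exact
    parameterization may differ."""

    def __init__(self, dim_generic: int, dim_ds: int, dim_da: int):
        super().__init__()
        # Project both embedding spaces into a common DA space (assumption).
        self.proj_generic = nn.Linear(dim_generic, dim_da, bias=False)
        self.proj_ds = nn.Linear(dim_ds, dim_da, bias=False)
        # Learned logits that set the mixing proportion between the two sources.
        self.mix_logits = nn.Parameter(torch.zeros(2))

    def forward(self, emb_generic: torch.Tensor, emb_ds: torch.Tensor) -> torch.Tensor:
        # emb_generic: (batch, seq_len, dim_generic); emb_ds: (batch, seq_len, dim_ds)
        alpha = torch.softmax(self.mix_logits, dim=0)  # convex combination weights
        da = alpha[0] * self.proj_generic(emb_generic) + alpha[1] * self.proj_ds(emb_ds)
        return da  # DA embeddings, fed to any downstream encoder + classifier

# Example usage with 300-d generic vectors and 100-d domain-specific vectors.
layer = ShallowAdaptationLayer(dim_generic=300, dim_ds=100, dim_da=200)
da_embeddings = layer(torch.randn(8, 20, 300), torch.randn(8, 20, 100))
print(da_embeddings.shape)  # torch.Size([8, 20, 200])
```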
- Anthology ID: D19-1557
- Volume: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
- Month: November
- Year: 2019
- Address: Hong Kong, China
- Editors: Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
- Venues: EMNLP | IJCNLP
- SIG: SIGDAT
- Publisher: Association for Computational Linguistics
- Pages: 5549–5558
- URL: https://aclanthology.org/D19-1557
- DOI: 10.18653/v1/D19-1557
- Cite (ACL): Prathusha K Sarma, Yingyu Liang, and William Sethares. 2019. Shallow Domain Adaptive Embeddings for Sentiment Analysis. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 5549–5558, Hong Kong, China. Association for Computational Linguistics.
- Cite (Informal): Shallow Domain Adaptive Embeddings for Sentiment Analysis (K Sarma et al., EMNLP-IJCNLP 2019)
- PDF: https://preview.aclanthology.org/emnlp22-frontmatter/D19-1557.pdf