Abstract
Word embeddings have been widely used in sentiment classification because of their efficacy as semantic representations of words. Given reviews from different domains, some existing methods for learning word embeddings exploit sentiment information but cannot produce domain-sensitive embeddings, while other methods generate domain-sensitive embeddings but cannot distinguish words that share similar contexts yet carry opposite sentiment polarity. We propose a new method for learning domain-sensitive and sentiment-aware embeddings that simultaneously captures the sentiment semantics and the domain sensitivity of individual words. Our method automatically determines and produces domain-common embeddings and domain-specific embeddings. Differentiating domain-common from domain-specific words allows common semantics to be augmented with data from multiple domains while, at the same time, capturing the varied semantics of domain-specific words across different domains. Experimental results show that our model provides an effective way to learn domain-sensitive and sentiment-aware word embeddings that benefit sentiment classification at both the sentence level and the lexicon term level.
- Anthology ID:
- P18-1232
- Volume:
- Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- Month:
- July
- Year:
- 2018
- Address:
- Melbourne, Australia
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 2494–2504
- URL:
- https://aclanthology.org/P18-1232
- DOI:
- 10.18653/v1/P18-1232
- Cite (ACL):
- Bei Shi, Zihao Fu, Lidong Bing, and Wai Lam. 2018. Learning Domain-Sensitive and Sentiment-Aware Word Embeddings. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2494–2504, Melbourne, Australia. Association for Computational Linguistics.
- Cite (Informal):
- Learning Domain-Sensitive and Sentiment-Aware Word Embeddings (Shi et al., ACL 2018)
- PDF:
- https://preview.aclanthology.org/remove-xml-comments/P18-1232.pdf
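The abstract's central idea — each word carries a domain-common representation plus per-domain representations, with the split between the two determined automatically — can be illustrated with a toy lookup. This is a hypothetical sketch, not the authors' implementation: the paper learns the common/specific assignment jointly with the embeddings, whereas here the per-word weight `p_common` is simply set by hand.

```python
import random

class DomainSensitiveEmbeddings:
    """Toy sketch: one domain-common vector per word, plus one vector per
    (domain, word) pair, mixed by a per-word weight p_common. All names
    here are illustrative, not from the paper."""

    def __init__(self, vocab, domains, dim, seed=0):
        rng = random.Random(seed)
        vec = lambda: [rng.gauss(0.0, 1.0) for _ in range(dim)]
        self.common = {w: vec() for w in vocab}
        self.specific = {d: {w: vec() for w in vocab} for d in domains}
        # Probability that a word is domain-common; learned in the paper,
        # hard-coded to 0.5 here for illustration.
        self.p_common = {w: 0.5 for w in vocab}

    def lookup(self, word, domain):
        # Expected embedding: mixture of the common vector and the
        # domain-specific vector, weighted by p_common.
        p = self.p_common[word]
        c, s = self.common[word], self.specific[domain][word]
        return [p * ci + (1 - p) * si for ci, si in zip(c, s)]
```

Intuitively, a sentiment word like "good" would end up with `p_common` near 1 (one embedding shared across domains, so training data from all domains reinforces it), while a word like "lightweight" — positive for laptops, potentially negative elsewhere — would end up with `p_common` near 0 and a distinct vector per domain.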