Abstract
Sentiment analysis is a task that depends heavily on the understanding of word senses. Traditional neural network models are black boxes that represent word senses as vectors uninterpretable to humans. On the other hand, applying Word Sense Disambiguation (WSD) systems in downstream tasks poses challenges regarding i) which words need to be disambiguated, and ii) how to model explicit word senses in terms easily understandable by a downstream model. This work proposes a neurosymbolic framework that incorporates WSD by identifying and paraphrasing ambiguous words to improve the accuracy of sentiment predictions. The framework allows us to see which words are paraphrased into which semantically unequivocal words, enabling a downstream task model to gain both accuracy and interpretability. To better fine-tune a lexical substitution model for WSD on a downstream task without ground-truth word sense labels, we leverage dynamic rewarding to jointly train the sentiment analysis and lexical substitution models. Our framework effectively improves the performance of sentiment analysis on corpora from different domains.
- Anthology ID:
- 2023.findings-emnlp.587
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2023
- Month:
- December
- Year:
- 2023
- Address:
- Singapore
- Editors:
- Houda Bouamor, Juan Pino, Kalika Bali
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 8772–8783
- URL:
- https://aclanthology.org/2023.findings-emnlp.587
- DOI:
- 10.18653/v1/2023.findings-emnlp.587
- Cite (ACL):
- Xulang Zhang, Rui Mao, Kai He, and Erik Cambria. 2023. Neuro-Symbolic Sentiment Analysis with Dynamic Word Sense Disambiguation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 8772–8783, Singapore. Association for Computational Linguistics.
- Cite (Informal):
- Neuro-Symbolic Sentiment Analysis with Dynamic Word Sense Disambiguation (Zhang et al., Findings 2023)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-5/2023.findings-emnlp.587.pdf
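The pipeline described in the abstract (identify an ambiguous word, paraphrase it into a semantically unequivocal substitute, then reward the substitution by the downstream sentiment model's confidence gain) can be illustrated with a toy sketch. This is not the authors' code: the lexicons, the scoring rules, and all function names (`disambiguate`, `dynamic_reward`, etc.) are hypothetical stand-ins for the paper's learned models.

```python
# Toy, purely illustrative sketch of the neurosymbolic idea:
# paraphrase ambiguous words before sentiment classification, and
# couple the two steps through a dynamic reward signal.

# Hypothetical substitution lexicon: ambiguous word -> candidate
# paraphrases with prior scores (a real system would use a learned
# lexical substitution model).
AMBIGUOUS_SUBSTITUTES = {
    "sick": {"ill": 0.6, "awesome": 0.4},
    "cheap": {"inexpensive": 0.7, "shoddy": 0.3},
}

# Dummy polarity lexicon standing in for a neural sentiment model.
POSITIVE = {"awesome", "inexpensive", "great"}
NEGATIVE = {"ill", "shoddy", "bad"}


def sentiment_score(tokens):
    """Dummy sentiment model: mean polarity of known polar words."""
    polar = [1 if t in POSITIVE else -1
             for t in tokens if t in POSITIVE | NEGATIVE]
    return sum(polar) / len(polar) if polar else 0.0


def disambiguate(tokens, context_polarity):
    """Symbolic step: replace each ambiguous word with the substitute
    whose polarity best agrees with the surrounding context."""
    out = []
    for t in tokens:
        if t in AMBIGUOUS_SUBSTITUTES:
            subs = AMBIGUOUS_SUBSTITUTES[t]
            out.append(max(
                subs,
                key=lambda s: subs[s] + 0.5 * context_polarity
                * (1 if s in POSITIVE else -1),
            ))
        else:
            out.append(t)
    return out


def dynamic_reward(original, paraphrased, label):
    """Reward the substitution by the sentiment model's confidence
    gain toward the gold label (+1 positive, -1 negative)."""
    return label * (sentiment_score(paraphrased) - sentiment_score(original))
```

In a training loop, `dynamic_reward` would supply the learning signal that lets the lexical substitution model be fine-tuned without ground-truth word sense labels, since a paraphrase is rewarded exactly when it makes the sentiment prediction more confident in the correct direction.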