Abstract
Sentiment analysis is a useful problem with applications in a variety of fields, from business intelligence to social and even health studies. Using the SemEval 2022 Task 10 formulation of this problem and taking sequence labeling as our approach, we propose a model that learns the task by finetuning a pretrained transformer, introducing as few new parameters as possible (~150k) and making use of the attention values already computed inside the transformer. Our model improves on the shared-task baselines on all task datasets.
- Anthology ID: 2022.semeval-1.192
- Volume: Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
- Month: July
- Year: 2022
- Address: Seattle, United States
- Venue: SemEval
- SIG: SIGLEX
- Publisher: Association for Computational Linguistics
- Pages: 1382–1388
- URL: https://aclanthology.org/2022.semeval-1.192
- DOI: 10.18653/v1/2022.semeval-1.192
- Cite (ACL): Sadrodin Barikbin. 2022. SLPL-Sentiment at SemEval-2022 Task 10: Making Use of Pre-Trained Model’s Attention Values in Structured Sentiment Analysis. In Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022), pages 1382–1388, Seattle, United States. Association for Computational Linguistics.
- Cite (Informal): SLPL-Sentiment at SemEval-2022 Task 10: Making Use of Pre-Trained Model’s Attention Values in Structured Sentiment Analysis (Barikbin, SemEval 2022)
- PDF: https://preview.aclanthology.org/remove-xml-comments/2022.semeval-1.192.pdf
- Data: MPQA Opinion Corpus, MultiBooked, NoReC_fine
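The abstract's central idea, reusing attention values that a pretrained transformer has already computed as cheap pairwise features for a small labeling head, can be illustrated with a minimal NumPy sketch. All sizes, the single-head self-attention, and the linear head below are hypothetical illustrations of the general technique, not the paper's actual architecture:

```python
import numpy as np

def attention_weights(q, k):
    # Scaled dot-product attention weights (softmax over keys),
    # as computed inside every transformer layer.
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    return w / w.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
n_tokens, d_model, n_labels = 6, 16, 5  # hypothetical sizes

h = rng.normal(size=(n_tokens, d_model))  # token representations
attn = attention_weights(h, h)            # "precomputed" attention values

# Lightweight head: one linear layer over each token's state concatenated
# with its attention row -- only (d_model + n_tokens) * n_labels parameters,
# so almost all capacity stays in the frozen-or-finetuned transformer.
W = rng.normal(size=(d_model + n_tokens, n_labels)) * 0.01
logits = np.concatenate([h, attn], axis=-1) @ W
labels = logits.argmax(axis=-1)  # one BIO-style tag per token
```

In a real system the representations `h` and the attention matrices would come from the pretrained model's forward pass rather than being recomputed, which is what keeps the parameter count of the task-specific head small.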