Target-specified Sequence Labeling with Multi-head Self-attention for Target-oriented Opinion Words Extraction

Yuhao Feng, Yanghui Rao, Yuyao Tang, Ninghua Wang, He Liu


Abstract
Opinion target extraction and opinion term extraction are two fundamental tasks in Aspect Based Sentiment Analysis (ABSA). Many recent works on ABSA focus on Target-oriented Opinion Words (or Terms) Extraction (TOWE), which aims at extracting the corresponding opinion words for a given opinion target. TOWE can be further applied to Aspect-Opinion Pair Extraction (AOPE), which aims at extracting aspects (i.e., opinion targets) and opinion terms in pairs. In this paper, we propose Target-Specified sequence labeling with Multi-head Self-Attention (TSMSA) for TOWE, into which any pre-trained language model with multi-head self-attention can be integrated conveniently. As a case study, we also develop a Multi-Task structure named MT-TSMSA for AOPE by combining our TSMSA with an aspect and opinion term extraction module. Experimental results indicate that TSMSA outperforms the benchmark methods on TOWE significantly; meanwhile, the performance of MT-TSMSA is similar to or even better than that of state-of-the-art AOPE baseline models.
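The abstract describes TOWE as target-specified sequence labeling built on multi-head self-attention. The sketch below illustrates that general idea only, not the paper's actual model: the target is marked with special tokens, the sentence is encoded with multi-head self-attention, and each token receives a B/I/O tag. All weights, dimensions, and the `<t>`/`</t>` marker tokens are illustrative assumptions made for this minimal NumPy example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, num_heads, rng):
    # X: (seq_len, d_model). Each head uses its own random (untrained)
    # query/key/value projections; outputs are concatenated.
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        Wq, Wk, Wv = (rng.standard_normal((d_model, d_head)) * 0.1
                      for _ in range(3))
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        attn = softmax(Q @ K.T / np.sqrt(d_head))  # (seq_len, seq_len)
        heads.append(attn @ V)
    return np.concatenate(heads, axis=-1)  # (seq_len, d_model)

# Hypothetical input: the opinion target "battery" is delimited by
# marker tokens so the encoder knows which target the tags refer to.
tokens = ["the", "<t>", "battery", "</t>", "lasts", "long"]
rng = np.random.default_rng(0)
d_model = 16
X = rng.standard_normal((len(tokens), d_model))   # stand-in embeddings

H = multi_head_self_attention(X, num_heads=4, rng=rng)
W_tag = rng.standard_normal((d_model, 3)) * 0.1   # 3 labels: B / I / O
tags = softmax(H @ W_tag).argmax(axis=-1)         # one tag per token
```

With trained weights, the tag head would mark the opinion words for the delimited target (e.g. "lasts long" as B/I); here the untrained weights only demonstrate the data flow and shapes.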
Anthology ID:
2021.naacl-main.145
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1805–1815
URL:
https://aclanthology.org/2021.naacl-main.145
DOI:
10.18653/v1/2021.naacl-main.145
Cite (ACL):
Yuhao Feng, Yanghui Rao, Yuyao Tang, Ninghua Wang, and He Liu. 2021. Target-specified Sequence Labeling with Multi-head Self-attention for Target-oriented Opinion Words Extraction. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 1805–1815, Online. Association for Computational Linguistics.
Cite (Informal):
Target-specified Sequence Labeling with Multi-head Self-attention for Target-oriented Opinion Words Extraction (Feng et al., NAACL 2021)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2021.naacl-main.145.pdf
Video:
https://preview.aclanthology.org/auto-file-uploads/2021.naacl-main.145.mp4
Code:
fengyh3/TSMSA