USSA: A Unified Table Filling Scheme for Structured Sentiment Analysis

Zepeng Zhai, Hao Chen, Ruifan Li, Xiaojie Wang


Abstract
Most previous studies on Structured Sentiment Analysis (SSA) have cast it as a bi-lexical dependency parsing problem, which cannot handle overlapping and discontinuous structures simultaneously. In this paper, we propose a targeted and effective solution. Our approach first constructs a novel bi-lexical dependency parsing graph and then converts it into a unified 2D table-filling scheme, namely USSA. The proposed scheme resolves the core bottleneck of previous SSA methods by using 13 types of relations. In addition, tailored to the USSA scheme, we develop a model with a bi-axial attention module that captures the correlations among relations along the rows and columns of the table. Extensive experiments on benchmark datasets demonstrate the effectiveness and robustness of our framework, which consistently outperforms state-of-the-art methods.
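
To make the bi-axial attention idea concrete, below is a minimal, illustrative PyTorch sketch (not the authors' released code): standard multi-head self-attention is applied first along each row and then along each column of the n x n table of word-pair representations, so that every cell can aggregate information from the cells sharing its row or column. The class name, dimensions, and the residual/normalization layout are assumptions made for illustration only.

import torch
import torch.nn as nn


class BiAxialAttention(nn.Module):
    """Sketch of row-wise then column-wise attention over a 2D relation table."""

    def __init__(self, dim: int = 256, num_heads: int = 8):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm_row = nn.LayerNorm(dim)
        self.norm_col = nn.LayerNorm(dim)

    def forward(self, table: torch.Tensor) -> torch.Tensor:
        # table: (batch, n, n, dim) word-pair representations
        b, n, _, d = table.shape

        # Row-wise attention: treat each of the b*n rows as a length-n sequence.
        rows = table.reshape(b * n, n, d)
        row_out, _ = self.row_attn(rows, rows, rows)
        table = self.norm_row(table + row_out.reshape(b, n, n, d))

        # Column-wise attention: transpose so each column becomes a sequence.
        cols = table.transpose(1, 2).reshape(b * n, n, d)
        col_out, _ = self.col_attn(cols, cols, cols)
        table = self.norm_col(
            table + col_out.reshape(b, n, n, d).transpose(1, 2)
        )
        return table


if __name__ == "__main__":
    layer = BiAxialAttention(dim=256, num_heads=8)
    x = torch.randn(2, 10, 10, 256)   # 2 sentences, 10 tokens each
    print(layer(x).shape)             # torch.Size([2, 10, 10, 256])
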
Anthology ID:
2023.acl-long.802
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
14340–14353
URL:
https://aclanthology.org/2023.acl-long.802
DOI:
10.18653/v1/2023.acl-long.802
Cite (ACL):
Zepeng Zhai, Hao Chen, Ruifan Li, and Xiaojie Wang. 2023. USSA: A Unified Table Filling Scheme for Structured Sentiment Analysis. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 14340–14353, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
USSA: A Unified Table Filling Scheme for Structured Sentiment Analysis (Zhai et al., ACL 2023)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2023.acl-long.802.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2023.acl-long.802.mp4