Relation-Aware Collaborative Learning for Unified Aspect-Based Sentiment Analysis

Zhuang Chen, Tieyun Qian


Abstract
Aspect-based sentiment analysis (ABSA) involves three subtasks: aspect term extraction, opinion term extraction, and aspect-level sentiment classification. Most existing studies focus on only one of these subtasks. Several recent works have made successful attempts to solve the complete ABSA problem within a unified framework. However, the interactive relations among the three subtasks remain under-exploited. We argue that such relations encode collaborative signals between the subtasks. For example, when the opinion term is “delicious”, the aspect term must be “food” rather than “place”. To fully exploit these relations, we propose a Relation-Aware Collaborative Learning (RACL) framework that allows the subtasks to work in coordination via multi-task learning and relation propagation mechanisms in a stacked multi-layer network. Extensive experiments on three real-world datasets demonstrate that RACL significantly outperforms state-of-the-art methods on the complete ABSA task.
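To make the architecture described in the abstract concrete, below is a minimal PyTorch sketch: each stacked layer hosts three subtask heads whose representations exchange collaborative signals via attention (standing in for relation propagation), and the heads are trained jointly under multi-task learning. All module names, dimensions, and the single-head attention choice are illustrative assumptions based only on the abstract, not the authors' method; the official implementation is in the repository listed under Code below.

    # Illustrative RACL-style sketch; details are assumptions, not the paper's design.
    import torch
    import torch.nn as nn

    class RACLLayer(nn.Module):
        """One stacked layer with three coupled subtask heads: aspect term
        extraction (AE), opinion term extraction (OE), and aspect-level
        sentiment classification (SC)."""

        def __init__(self, dim: int):
            super().__init__()
            self.ae = nn.Linear(dim, dim)  # aspect-term features
            self.oe = nn.Linear(dim, dim)  # opinion-term features
            self.sc = nn.Linear(dim, dim)  # sentiment features
            # Relation propagation (assumed form): each subtask attends to
            # another, e.g. opinion cues ("delicious") constrain aspect terms.
            self.a_from_o = nn.MultiheadAttention(dim, num_heads=1, batch_first=True)
            self.s_from_a = nn.MultiheadAttention(dim, num_heads=1, batch_first=True)

        def forward(self, h: torch.Tensor):
            a = torch.relu(self.ae(h))  # (batch, seq, dim)
            o = torch.relu(self.oe(h))
            s = torch.relu(self.sc(h))
            a_ctx, _ = self.a_from_o(a, o, o)  # aspects informed by opinions
            s_ctx, _ = self.s_from_a(s, a, a)  # sentiment informed by aspects
            return a + a_ctx, o, s + s_ctx

    class RACL(nn.Module):
        """Stack of RACLLayers trained with a joint multi-task loss."""

        def __init__(self, dim: int, num_layers: int, num_polarities: int = 3):
            super().__init__()
            self.layers = nn.ModuleList(RACLLayer(dim) for _ in range(num_layers))
            self.ae_tag = nn.Linear(dim, 3)               # BIO tags, aspect terms
            self.oe_tag = nn.Linear(dim, 3)               # BIO tags, opinion terms
            self.sc_cls = nn.Linear(dim, num_polarities)  # per-token polarity

        def forward(self, h: torch.Tensor):
            for layer in self.layers:
                a, o, s = layer(h)
                h = a + o + s  # fuse subtask features as input to the next layer
            return self.ae_tag(a), self.oe_tag(o), self.sc_cls(s)

    # Example: token embeddings in, three subtask predictions out.
    model = RACL(dim=64, num_layers=2)
    tokens = torch.randn(2, 16, 64)  # e.g. pretrained word embeddings
    ae_logits, oe_logits, sc_logits = model(tokens)
    # Training would sum one cross-entropy loss per subtask (multi-task learning).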
Anthology ID:
2020.acl-main.340
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3685–3694
URL:
https://aclanthology.org/2020.acl-main.340
DOI:
10.18653/v1/2020.acl-main.340
Cite (ACL):
Zhuang Chen and Tieyun Qian. 2020. Relation-Aware Collaborative Learning for Unified Aspect-Based Sentiment Analysis. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 3685–3694, Online. Association for Computational Linguistics.
Cite (Informal):
Relation-Aware Collaborative Learning for Unified Aspect-Based Sentiment Analysis (Chen & Qian, ACL 2020)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2020.acl-main.340.pdf
Video:
http://slideslive.com/38928854
Code:
NLPWM-WHU/RACL