Hierarchical Enhancement Framework for Aspect-based Argument Mining

Yujie Fu, Yang Li, Suge Wang, Xiaoli Li, Deyu Li, Jian Liao, JianXing Zheng


Abstract
Aspect-Based Argument Mining (ABAM) is a critical task in computational argumentation. Existing methods have primarily treated ABAM as a nested named entity recognition problem, overlooking the need for strategies tailored to the specific challenges of the task. To this end, we propose a layer-based Hierarchical Enhancement Framework (HEF) for ABAM and introduce three novel components: the Semantic and Syntactic Fusion (SSF) component, the Batch-level Heterogeneous Graph Attention Network (BHGAT) component, and the Span Mask Interactive Attention (SMIA) component. These components respectively optimize the underlying representations, detect argument unit stances, and constrain aspect term recognition boundaries. By incorporating these components, our framework handles these challenges more effectively and improves accuracy in argument unit and aspect term recognition. Experiments on multiple datasets and tasks verify the effectiveness of the proposed framework and its components.
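
The abstract describes a three-stage pipeline: SSF enhances the underlying token representations, BHGAT detects argument-unit stance, and SMIA constrains aspect-term tagging to candidate spans. The following is a minimal sketch, assuming a PyTorch setup, of how such components might be composed; the class names mirror the paper's component names, but their internals, dimensions, and heads are illustrative placeholders rather than the authors' implementation.

import torch
import torch.nn as nn


class SSF(nn.Module):
    """Placeholder fusion of semantic embeddings with syntactic features."""
    def __init__(self, sem_dim, syn_dim, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(sem_dim + syn_dim, hidden_dim)

    def forward(self, sem, syn):
        return torch.relu(self.proj(torch.cat([sem, syn], dim=-1)))


class BHGAT(nn.Module):
    """Stand-in for batch-level graph attention, used here for stance detection."""
    def __init__(self, hidden_dim, num_stances=3):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads=4, batch_first=True)
        self.stance_head = nn.Linear(hidden_dim, num_stances)

    def forward(self, h):
        ctx, _ = self.attn(h, h, h)                      # token interaction as a graph-attention proxy
        return ctx, self.stance_head(ctx.mean(dim=1))    # pooled stance logits per sentence


class SMIA(nn.Module):
    """Restrict aspect-term tagging to masked (candidate argument-unit) spans."""
    def __init__(self, hidden_dim, num_tags=3):
        super().__init__()
        self.tag_head = nn.Linear(hidden_dim, num_tags)

    def forward(self, h, span_mask):
        # span_mask: (batch, seq_len), 1 inside candidate spans, 0 elsewhere
        return self.tag_head(h * span_mask.unsqueeze(-1))


class HEFSketch(nn.Module):
    """Hierarchical composition of the three components (illustrative only)."""
    def __init__(self, sem_dim=768, syn_dim=64, hidden_dim=256):
        super().__init__()
        self.ssf = SSF(sem_dim, syn_dim, hidden_dim)
        self.bhgat = BHGAT(hidden_dim)
        self.smia = SMIA(hidden_dim)

    def forward(self, sem, syn, span_mask):
        h = self.ssf(sem, syn)                    # 1) enhance underlying representations
        h, stance_logits = self.bhgat(h)          # 2) detect argument-unit stance
        aspect_logits = self.smia(h, span_mask)   # 3) tag aspect terms within span boundaries
        return stance_logits, aspect_logits


if __name__ == "__main__":
    model = HEFSketch()
    sem = torch.randn(2, 20, 768)    # e.g., contextual token embeddings
    syn = torch.randn(2, 20, 64)     # e.g., dependency/POS features
    mask = torch.ones(2, 20)
    stance, aspects = model(sem, syn, mask)
    print(stance.shape, aspects.shape)  # (2, 3) and (2, 20, 3)

In this sketch, multi-head self-attention stands in for the batch-level heterogeneous graph attention, and a simple span mask scales the hidden states before aspect tagging; the paper's actual components are described in the PDF linked below.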
Anthology ID:
2023.findings-emnlp.99
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1423–1433
URL:
https://aclanthology.org/2023.findings-emnlp.99
DOI:
10.18653/v1/2023.findings-emnlp.99
Cite (ACL):
Yujie Fu, Yang Li, Suge Wang, Xiaoli Li, Deyu Li, Jian Liao, and JianXing Zheng. 2023. Hierarchical Enhancement Framework for Aspect-based Argument Mining. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 1423–1433, Singapore. Association for Computational Linguistics.
Cite (Informal):
Hierarchical Enhancement Framework for Aspect-based Argument Mining (Fu et al., Findings 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-emnlp.99.pdf