Constituency Tree Representation for Argument Unit Recognition

Samuel Guilluy, Florian Mehats, Billal Chouli


Abstract
Conventional methods for extracting arguments from sentences rely solely on word proximity and disregard the syntactic structure of the sentence, which often leads to inaccuracies, especially when identifying the boundaries of argumentative spans. In this research, we investigate the benefits of a constituency tree representation of sentences for predicting Argument Discourse Units (ADUs) at the token level. We first evaluate how well the constituency tree representation captures the structural attributes of arguments within sentences, and we demonstrate empirically that constituency structure is more effective than simple linear dependencies among neighboring words. Our approach combines graph neural networks with the constituency tree, adapting them specifically for argument unit recognition. In extensive evaluations, our model outperforms existing approaches at recognizing argument units at the token level. Furthermore, we employ explainability methods to assess the suitability of our model architecture, providing insights into its performance.
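The approach outlined in the abstract rests on two off-the-shelf components: a constituency parser and a graph neural network. As a minimal sketch of the first step (not the authors' implementation), the snippet below converts a bracketed constituency parse into a graph whose nodes are constituents and leaf tokens and whose edges follow the tree, in a form a GNN library such as PyTorch Geometric can consume. The example sentence, the placeholder node features, and the library choices are illustrative assumptions.

import torch
from nltk import Tree
from torch_geometric.data import Data

# Illustrative bracketed parse; a real pipeline would obtain this
# from a constituency parser.
parse = Tree.fromstring(
    "(S (NP (DT The) (NN evidence)) "
    "(VP (VBZ supports) (NP (DT the) (NN claim))))"
)

nodes, edges = [], []

def add_node(tree, parent=None):
    """Register a constituent (or leaf token) and link it to its parent."""
    idx = len(nodes)
    label = tree.label() if isinstance(tree, Tree) else tree
    nodes.append(label)
    if parent is not None:
        edges.append((parent, idx))
    if isinstance(tree, Tree):
        for child in tree:
            add_node(child, idx)
    return idx

add_node(parse)

# Placeholder features (node index only); a real model would use
# label embeddings or contextual token vectors instead.
x = torch.arange(len(nodes), dtype=torch.float).unsqueeze(1)
edge_index = torch.tensor(edges, dtype=torch.long).t().contiguous()
graph = Data(x=x, edge_index=edge_index)

print(nodes)   # constituent labels followed by leaf tokens
print(graph)   # node features and parent-child edge index

In the paper's setting, a token-level classifier over the leaf nodes of such a graph would then label each token as inside or outside an argument unit.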
Anthology ID: 2023.argmining-1.4
Volume: Proceedings of the 10th Workshop on Argument Mining
Month: December
Year: 2023
Address: Singapore
Editors: Milad Alshomary, Chung-Chi Chen, Smaranda Muresan, Joonsuk Park, Julia Romberg
Venues: ArgMining | WS
Publisher: Association for Computational Linguistics
Pages: 35–44
URL: https://aclanthology.org/2023.argmining-1.4
DOI: 10.18653/v1/2023.argmining-1.4
Cite (ACL): Samuel Guilluy, Florian Mehats, and Billal Chouli. 2023. Constituency Tree Representation for Argument Unit Recognition. In Proceedings of the 10th Workshop on Argument Mining, pages 35–44, Singapore. Association for Computational Linguistics.
Cite (Informal): Constituency Tree Representation for Argument Unit Recognition (Guilluy et al., ArgMining-WS 2023)
PDF: https://preview.aclanthology.org/naacl24-info/2023.argmining-1.4.pdf