Graph Transformer Networks with Syntactic and Semantic Structures for Event Argument Extraction

Amir Pouran Ben Veyseh, Tuan Ngo Nguyen, Thien Huu Nguyen


Abstract
The goal of Event Argument Extraction (EAE) is to identify the role of each entity mention with respect to a given event trigger word. Prior work has shown that the syntactic structures of sentences are helpful for deep learning models for EAE. However, a major limitation of these prior works is that they fail to exploit the semantic structures of sentences to induce effective representations for EAE. In this work, we therefore propose a novel model for EAE that exploits both syntactic and semantic structures of sentences with Graph Transformer Networks (GTNs) to learn more effective sentence structures for EAE. In addition, we introduce a novel inductive bias based on the information bottleneck to improve the generalization of EAE models. Extensive experiments demonstrate the benefits of the proposed model, which achieves state-of-the-art performance for EAE on standard datasets.
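To make the core idea concrete, the following is a minimal, hypothetical PyTorch sketch of how GTN-style soft selection could combine a syntactic adjacency matrix (e.g., from a dependency parse) with a semantic adjacency matrix (e.g., word-pair similarity scores) into a learned sentence structure for argument prediction. The class names, layer sizes, and the simple graph convolution here are illustrative assumptions, not the authors' actual architecture, and the information-bottleneck regularizer is not shown.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GTNLayer(nn.Module):
    """One GTN-style layer: softly selects and composes a stack of input
    adjacency matrices (e.g., syntactic + semantic) into a new graph."""

    def __init__(self, num_graphs: int):
        super().__init__()
        # Learnable scores over the input graphs for two soft selections.
        self.weights1 = nn.Parameter(torch.randn(num_graphs))
        self.weights2 = nn.Parameter(torch.randn(num_graphs))

    def forward(self, adj_stack: torch.Tensor) -> torch.Tensor:
        # adj_stack: (num_graphs, n, n) stack of adjacency matrices.
        alpha1 = F.softmax(self.weights1, dim=0)           # soft graph selection 1
        alpha2 = F.softmax(self.weights2, dim=0)           # soft graph selection 2
        q1 = torch.einsum('k,knm->nm', alpha1, adj_stack)  # weighted sum of graphs
        q2 = torch.einsum('k,knm->nm', alpha2, adj_stack)
        # Compose the two selected graphs into a new, learned structure.
        return q1 @ q2


class EAEGraphModel(nn.Module):
    """Sketch: combine syntactic and semantic graphs with a GTN layer, then
    run a simple graph convolution over contextual word representations."""

    def __init__(self, hidden_dim: int, num_graphs: int = 2):
        super().__init__()
        self.gtn = GTNLayer(num_graphs)
        self.gcn = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, word_reprs, syntactic_adj, semantic_adj):
        # word_reprs: (n, hidden_dim) contextual embeddings (e.g., from BERT).
        adj_stack = torch.stack([syntactic_adj, semantic_adj], dim=0)
        learned_adj = self.gtn(adj_stack)                  # (n, n) learned structure
        # Row-normalize before propagation to keep magnitudes stable.
        learned_adj = learned_adj / learned_adj.sum(dim=-1, keepdim=True).clamp(min=1e-6)
        return torch.relu(self.gcn(learned_adj @ word_reprs))


# Toy usage: 5 words with 16-dimensional representations.
n, d = 5, 16
words = torch.randn(n, d)
syn_adj = torch.eye(n)                      # stand-in for a dependency-tree adjacency
sem_adj = F.softmax(torch.randn(n, n), -1)  # stand-in for semantic similarity scores
model = EAEGraphModel(hidden_dim=d)
print(model(words, syn_adj, sem_adj).shape)  # torch.Size([5, 16])
```

The resulting per-word representations would then feed a role classifier for each entity mention; in the paper, the learned structure is additionally regularized via the information-bottleneck-based inductive bias.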
Anthology ID: 2020.findings-emnlp.326
Volume: Findings of the Association for Computational Linguistics: EMNLP 2020
Month: November
Year: 2020
Address: Online
Editors: Trevor Cohn, Yulan He, Yang Liu
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 3651–3661
URL: https://aclanthology.org/2020.findings-emnlp.326
DOI: 10.18653/v1/2020.findings-emnlp.326
Cite (ACL):
Amir Pouran Ben Veyseh, Tuan Ngo Nguyen, and Thien Huu Nguyen. 2020. Graph Transformer Networks with Syntactic and Semantic Structures for Event Argument Extraction. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 3651–3661, Online. Association for Computational Linguistics.
Cite (Informal):
Graph Transformer Networks with Syntactic and Semantic Structures for Event Argument Extraction (Pouran Ben Veyseh et al., Findings 2020)
PDF: https://preview.aclanthology.org/add_acl24_videos/2020.findings-emnlp.326.pdf