EA2E: Improving Consistency with Event Awareness for Document-Level Argument Extraction

Qi Zeng, Qiusi Zhan, Heng Ji


Abstract
Events are inter-related in documents. Motivated by the one-sense-per-discourse theory, we hypothesize that a participant tends to play consistent roles across multiple events in the same document. However, recent work on document-level event argument extraction models each individual event in isolation, which causes inconsistency among the arguments extracted across events and, in turn, discrepancies in downstream applications such as event knowledge base population, question answering, and hypothesis generation. In this work, we formulate event argument consistency as constraints derived from event-event relations in the document-level setting. To improve consistency, we introduce the Event-Aware Argument Extraction (EA2E) model, which augments the context for both training and inference. Experimental results on the WIKIEVENTS and ACE2005 datasets demonstrate the effectiveness of EA2E compared to baseline methods.
Anthology ID:
2022.findings-naacl.202
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2649–2655
URL:
https://aclanthology.org/2022.findings-naacl.202
DOI:
10.18653/v1/2022.findings-naacl.202
Cite (ACL):
Qi Zeng, Qiusi Zhan, and Heng Ji. 2022. EA2E: Improving Consistency with Event Awareness for Document-Level Argument Extraction. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2649–2655, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
EA2E: Improving Consistency with Event Awareness for Document-Level Argument Extraction (Zeng et al., Findings 2022)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2022.findings-naacl.202.pdf
Software:
 2022.findings-naacl.202.software.zip
Video:
 https://preview.aclanthology.org/dois-2013-emnlp/2022.findings-naacl.202.mp4
Code:
zqs1943/docie