Event Temporal Relation Extraction with Bayesian Translational Model

Xingwei Tan, Gabriele Pergola, Yulan He


Abstract
Existing models for extracting temporal relations between events lack a principled method for incorporating external knowledge. In this study, we introduce Bayesian-Trans, a Bayesian learning-based method that models temporal relation representations as latent variables and infers their values via Bayesian inference and translational functions. Unlike conventional neural approaches, which perform point estimation to find the best set of parameters, the proposed model infers the parameters’ posterior distribution directly, enhancing its capability to encode and express uncertainty about its predictions. Experimental results on three widely used datasets show that Bayesian-Trans outperforms existing approaches to event temporal relation extraction. We additionally present detailed analyses of uncertainty quantification, a comparison of priors, and ablation studies, illustrating the benefits of the proposed approach.
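To make the abstract's idea concrete, below is a minimal, hedged sketch (not the authors' released code) of a translational scorer for event temporal relations in which the relation embeddings are latent variables with a Gaussian variational posterior, trained via the reparameterization trick with a KL penalty toward a standard normal prior. The label set, dimensions, and the use of generic event encodings are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

RELATIONS = ["BEFORE", "AFTER", "EQUAL", "VAGUE"]  # assumed label inventory

class BayesianTransScorer(nn.Module):
    """Translational relation scorer with a variational posterior over relation embeddings."""

    def __init__(self, event_dim=768, rel_dim=768, n_rels=len(RELATIONS)):
        super().__init__()
        # Variational posterior q(r) = N(mu, sigma^2) for each relation embedding.
        self.rel_mu = nn.Parameter(torch.randn(n_rels, rel_dim) * 0.02)
        self.rel_logvar = nn.Parameter(torch.zeros(n_rels, rel_dim))
        # Project contextual event encodings into the relation space.
        self.proj = nn.Linear(event_dim, rel_dim)

    def sample_relations(self):
        # Reparameterization trick: r = mu + sigma * eps, eps ~ N(0, I).
        eps = torch.randn_like(self.rel_mu)
        return self.rel_mu + torch.exp(0.5 * self.rel_logvar) * eps

    def kl_term(self):
        # KL( q(r) || N(0, I) ) summed over relations and dimensions.
        return 0.5 * torch.sum(
            self.rel_mu.pow(2) + self.rel_logvar.exp() - self.rel_logvar - 1.0
        )

    def forward(self, head_event, tail_event):
        # head_event, tail_event: [batch, event_dim] encodings of the two events.
        h = self.proj(head_event)          # [batch, rel_dim]
        t = self.proj(tail_event)          # [batch, rel_dim]
        r = self.sample_relations()        # [n_rels, rel_dim]
        # Translational score: the closer h + r is to t, the more plausible relation r.
        diff = h.unsqueeze(1) + r.unsqueeze(0) - t.unsqueeze(1)  # [batch, n_rels, rel_dim]
        return -diff.norm(dim=-1)          # logits: negative distance per relation

# Toy usage with random tensors standing in for a pretrained event encoder.
model = BayesianTransScorer()
head, tail = torch.randn(4, 768), torch.randn(4, 768)
labels = torch.tensor([0, 1, 2, 3])
logits = model(head, tail)
loss = F.cross_entropy(logits, labels) + 1e-3 * model.kl_term()
loss.backward()
```

At test time, one would typically average the logits over several posterior samples; the spread of those samples gives a simple handle on predictive uncertainty, which is the kind of analysis the abstract refers to.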
Anthology ID:
2023.eacl-main.80
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1125–1138
URL:
https://aclanthology.org/2023.eacl-main.80
DOI:
10.18653/v1/2023.eacl-main.80
Cite (ACL):
Xingwei Tan, Gabriele Pergola, and Yulan He. 2023. Event Temporal Relation Extraction with Bayesian Translational Model. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 1125–1138, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Event Temporal Relation Extraction with Bayesian Translational Model (Tan et al., EACL 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2023.eacl-main.80.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-4/2023.eacl-main.80.mp4