Abstract
Event representation learning has been shown to be beneficial in various downstream tasks. Current event representation learning methods, which mainly capture the semantics of events via deterministic vector embeddings, have made notable progress. However, they ignore two important properties: the multiple relations between events and the uncertainty within events. In this paper, we propose a novel approach to learning multi-relational probabilistic event embeddings based on contrastive learning. Specifically, the proposed method consists of three major modules: a multi-relational event generation module to automatically generate multi-relational training data, a probabilistic event encoding module to model the uncertainty of events via Gaussian density embeddings, and a relation-aware projection module to adapt to unseen relations by projecting Gaussian embeddings into relation-aware subspaces. Moreover, a novel contrastive learning loss is elaborately designed for learning the multi-relational probabilistic embeddings. Since existing benchmarks for event representation learning ignore the relations and uncertainty of events, a novel dataset named MRPES is constructed to investigate whether the multiple relations between events and the uncertainty within events are learned. Experimental results show that the proposed approach outperforms other state-of-the-art baselines on both the existing and the newly constructed datasets.
- Anthology ID: 2023.findings-acl.384
- Volume: Findings of the Association for Computational Linguistics: ACL 2023
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 6162–6174
- URL: https://aclanthology.org/2023.findings-acl.384
- DOI: 10.18653/v1/2023.findings-acl.384
- Cite (ACL): Linhai Zhang, Congzhi Zhang, and Deyu Zhou. 2023. Multi-Relational Probabilistic Event Representation Learning via Projected Gaussian Embedding. In Findings of the Association for Computational Linguistics: ACL 2023, pages 6162–6174, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Multi-Relational Probabilistic Event Representation Learning via Projected Gaussian Embedding (Zhang et al., Findings 2023)
- PDF: https://preview.aclanthology.org/nschneid-patch-2/2023.findings-acl.384.pdf
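To make the abstract's core ideas concrete, here is a minimal, hedged sketch of what a Gaussian (probabilistic) event embedding with relation-aware projection and a contrastive loss could look like. This is an illustration under our own simplifying assumptions, not the paper's implementation: `RelationProjectedGaussian`, `kl_diag_gaussian`, and `info_nce_loss` are hypothetical names, the per-relation linear projections are randomly initialized rather than learned, and negative KL divergence is used as the similarity score (a common choice for Gaussian embeddings, not necessarily the paper's exact formulation).

```python
import numpy as np

def kl_diag_gaussian(mu_p, var_p, mu_q, var_q):
    """KL(N(mu_p, diag(var_p)) || N(mu_q, diag(var_q))) for diagonal Gaussians."""
    return 0.5 * np.sum(
        np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0
    )

class RelationProjectedGaussian:
    """Illustrative sketch: an event is a diagonal Gaussian (mean + variance),
    and each relation gets its own linear projection into a relation-aware
    subspace. Projections here are random for demonstration; in a real model
    they would be learned parameters."""
    def __init__(self, dim, proj_dim, relations, seed=0):
        rng = np.random.default_rng(seed)
        self.W = {r: rng.standard_normal((proj_dim, dim)) / np.sqrt(dim)
                  for r in relations}

    def project(self, mu, var, relation):
        W = self.W[relation]
        mu_r = W @ mu
        # Diagonal approximation of the projected covariance diag(W diag(var) W^T).
        var_r = (W ** 2) @ var
        return mu_r, var_r

def info_nce_loss(anchor, positive, negatives, tau=1.0):
    """InfoNCE-style contrastive loss over Gaussian embeddings, using
    negative KL divergence as the similarity score."""
    sims = [-kl_diag_gaussian(*anchor, *positive)]
    sims += [-kl_diag_gaussian(*anchor, *neg) for neg in negatives]
    logits = np.array(sims) / tau
    logits -= logits.max()  # numerical stability before the softmax
    return -np.log(np.exp(logits[0]) / np.exp(logits).sum())
```

In this sketch, a related event pair projected under the same relation should end up with a small KL divergence (high similarity), while unrelated events act as negatives, so minimizing the loss pulls positives together and pushes negatives apart within each relation-aware subspace.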