Abstract
Entailment Graphs (EGs) have been constructed from extracted corpora as a strong and explainable representation of context-independent entailment relations in natural language. However, EGs built by previous methods often suffer from severe sparsity, due to the limited corpora available and the long-tail distribution of predicates. In this paper, we propose a multi-stage method, Typed Predicate-Entailment Graph Generator (TP-EGG), to tackle this problem. Given several seed predicates, TP-EGG builds a graph by generating new predicates and detecting entailment relations among them. The generative nature of TP-EGG lets us leverage recent advances in large pretrained language models (PLMs) while avoiding reliance on carefully prepared corpora. Experiments on benchmark datasets show that TP-EGG can generate high-quality, scale-controllable entailment graphs, achieving significant in-domain improvements over state-of-the-art EGs and boosting performance on downstream inference tasks.
- Anthology ID: 2023.acl-long.196
- Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 3534–3551
- URL: https://aclanthology.org/2023.acl-long.196
- DOI: 10.18653/v1/2023.acl-long.196
- Cite (ACL): Zhibin Chen, Yansong Feng, and Dongyan Zhao. 2023. From the One, Judge of the Whole: Typed Entailment Graph Construction with Predicate Generation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 3534–3551, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): From the One, Judge of the Whole: Typed Entailment Graph Construction with Predicate Generation (Chen et al., ACL 2023)
- PDF: https://preview.aclanthology.org/nschneid-patch-5/2023.acl-long.196.pdf