Abstract
Open information extraction (OIE) is the task of extracting facts “(Subject, Relation, Object)” from natural language text. In this paper, we propose several new methods for training neural OIE models. First, we propose a novel method for computing syntactically rich text embeddings using the structure of dependency trees. Second, we propose a new discriminative training approach to OIE in which tokens in the generated fact are classified as “real” or “fake”, i.e., those tokens that appear in both the generated and the gold tuples, and those that appear only in the generated tuple but not in the gold tuple. We also address the issue of repetitive tokens in generated facts and improve the models’ ability to generate implicit facts. Our approach reduces repetitive tokens by 23%. Finally, we present paraphrased versions of the CaRB, OIE2016, and LSOIE datasets, and show that the models’ performance improves substantially when they are trained on these augmented datasets. Our best model outperforms the previous state of the art, IMoJIE, on the recent CaRB dataset, with an improvement of 39.63% in F1 score.
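As an illustration of the “real”/“fake” token labelling idea mentioned in the abstract, the minimal Python sketch below labels each token of a generated fact as 1 if it also occurs in the gold tuple and 0 otherwise; such labels could serve as targets for a token-level discriminator. The function and example names are hypothetical and are not taken from the paper’s implementation.

```python
# Hypothetical sketch of the "real"/"fake" token labelling scheme:
# tokens of a generated fact that also appear in the gold tuple are
# labelled "real" (1), the rest "fake" (0).

from typing import List


def label_generated_tokens(generated: List[str], gold: List[str]) -> List[int]:
    """Return 1 ("real") for each generated token found in the gold tuple, else 0 ("fake")."""
    gold_vocab = set(gold)
    return [1 if tok in gold_vocab else 0 for tok in generated]


if __name__ == "__main__":
    generated_fact = ["Barack", "Obama", "was", "born", "in", "Nairobi"]
    gold_fact = ["Barack", "Obama", "was", "born", "in", "Honolulu"]
    # Prints [1, 1, 1, 1, 1, 0] -- "Nairobi" is a "fake" token in this toy example.
    print(label_generated_tokens(generated_fact, gold_fact))
```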
- Anthology ID:
- 2022.emnlp-main.401
- Volume:
- Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
- Month:
- December
- Year:
- 2022
- Address:
- Abu Dhabi, United Arab Emirates
- Editors:
- Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 5972–5987
- URL:
- https://aclanthology.org/2022.emnlp-main.401
- DOI:
- 10.18653/v1/2022.emnlp-main.401
- Cite (ACL):
- Frank Mtumbuka and Thomas Lukasiewicz. 2022. Syntactically Rich Discriminative Training: An Effective Method for Open Information Extraction. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 5972–5987, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
- Cite (Informal):
- Syntactically Rich Discriminative Training: An Effective Method for Open Information Extraction (Mtumbuka & Lukasiewicz, EMNLP 2022)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-5/2022.emnlp-main.401.pdf