From Speculation Detection to Trustworthy Relational Tuples in Information Extraction

Kuicai Dong, Aixin Sun, Jung-jae Kim, Xiaoli Li


Abstract
Speculation detection is an important NLP task for identifying text factuality. However, the extracted speculative information (e.g., speculative polarity, cue, and scope) lacks structure and poses challenges for direct use in downstream tasks. Open Information Extraction (OIE), on the other hand, extracts structured tuples as facts, without examining the certainty of these tuples. Bridging this gap between speculation detection and information extraction becomes imperative for generating structured speculative information and trustworthy relational tuples. Existing studies define speculation detection at the sentence level; however, even if a sentence is determined to be speculative, not all tuples extracted from it are speculative. In this paper, we propose to study speculation in OIE tuples and determine whether a tuple is speculative. We formally define the research problem of tuple-level speculation detection. We then conduct a detailed analysis of the LSOIE dataset, which provides labels for speculative tuples. Lastly, we propose a baseline model, SpecTup, for this new research task.
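To make the task concrete, the following minimal sketch illustrates the tuple-level setting described above: a single sentence can yield both speculative and non-speculative tuples, so a sentence-level label is too coarse. All names, the example sentence, and the naive cue-matching heuristic are illustrative assumptions, not the SpecTup model or the LSOIE annotation scheme.

```python
# Hypothetical sketch of tuple-level speculation detection.
# Each OIE tuple from a sentence is classified independently.
from dataclasses import dataclass

@dataclass
class OIETuple:
    subject: str
    relation: str
    obj: str

# Naive cue-based baseline (assumed for illustration): flag a tuple as
# speculative only if a speculation cue word appears in its relation phrase.
CUES = {"may", "might", "could", "possibly", "reportedly"}

def is_speculative(t: OIETuple) -> bool:
    return any(word in CUES for word in t.relation.lower().split())

sentence = "The new drug may reduce inflammation, and it was approved in 2021."
tuples = [
    OIETuple("The new drug", "may reduce", "inflammation"),
    OIETuple("it", "was approved in", "2021"),
]
labels = [is_speculative(t) for t in tuples]
# Only the first tuple is speculative, even though the sentence as a
# whole contains a speculation cue — a sentence-level label would
# wrongly mark the second (factual) tuple as uncertain.
```

A real system would, as the paper suggests, learn such decisions from tuple-level annotations rather than rely on a fixed cue list, since cues can be ambiguous and their scope does not always cover every tuple in the sentence.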
Anthology ID:
2023.findings-emnlp.886
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13287–13299
URL:
https://aclanthology.org/2023.findings-emnlp.886
DOI:
10.18653/v1/2023.findings-emnlp.886
Cite (ACL):
Kuicai Dong, Aixin Sun, Jung-jae Kim, and Xiaoli Li. 2023. From Speculation Detection to Trustworthy Relational Tuples in Information Extraction. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 13287–13299, Singapore. Association for Computational Linguistics.
Cite (Informal):
From Speculation Detection to Trustworthy Relational Tuples in Information Extraction (Dong et al., Findings 2023)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2023.findings-emnlp.886.pdf