Abstract
Answering time-sensitive questions from long documents requires temporal reasoning over the times in questions and documents. An important open question is whether large language models can perform such reasoning solely using a provided text document, or whether they can benefit from additional temporal information extracted using other systems. We address this research question by applying existing temporal information extraction systems to construct temporal graphs of events, times, and temporal relations in questions and documents. We then investigate different approaches for fusing these graphs into Transformer models. Experimental results show that our proposed approach for fusing temporal graphs into input text substantially enhances the temporal reasoning capabilities of Transformer models with or without fine-tuning. Additionally, our proposed method outperforms various graph convolution-based approaches and establishes a new state-of-the-art performance on SituatedQA and three splits of TimeQA.
- Anthology ID:
- 2023.findings-emnlp.67
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2023
- Month:
- December
- Year:
- 2023
- Address:
- Singapore
- Editors:
- Houda Bouamor, Juan Pino, Kalika Bali
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 948–966
- URL:
- https://aclanthology.org/2023.findings-emnlp.67
- DOI:
- 10.18653/v1/2023.findings-emnlp.67
- Cite (ACL):
- Xin Su, Phillip Howard, Nagib Hakim, and Steven Bethard. 2023. Fusing Temporal Graphs into Transformers for Time-Sensitive Question Answering. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 948–966, Singapore. Association for Computational Linguistics.
- Cite (Informal):
- Fusing Temporal Graphs into Transformers for Time-Sensitive Question Answering (Su et al., Findings 2023)
- PDF:
- https://aclanthology.org/2023.findings-emnlp.67.pdf