Towards Benchmarking and Improving the Temporal Reasoning Capability of Large Language Models

Qingyu Tan, Hwee Tou Ng, Lidong Bing


Abstract
Reasoning about time is of fundamental importance, since many facts are time-dependent. For example, athletes change teams from time to time, and government officials are elected periodically. Previous time-dependent question answering (QA) datasets tend to be biased in either their coverage of time spans or their question types. In this paper, we introduce TempReason, a comprehensive probing dataset for evaluating the temporal reasoning capability of large language models. Our dataset includes questions at three levels of temporal reasoning. In addition, we propose a novel learning framework, based on temporal span extraction and time-sensitive reinforcement learning, to improve the temporal reasoning capability of large language models. We conducted experiments in closed-book QA, open-book QA, and reasoning QA settings and demonstrated the effectiveness of our approach.
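The abstract gives no concrete example of a time-dependent question. The sketch below is a minimal, hypothetical illustration (not the paper's actual data format or method) of the underlying idea: a time-dependent fact can be modeled as a (subject, relation, object) triple with a validity interval, so that a time-scoped question such as "Which team did X play for in 2020?" has a gold answer that changes with the query time. All class names, field names, and dates are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch: a time-dependent fact as a relation triple plus a
# validity interval. The dates below are approximate and for illustration only.
@dataclass
class TemporalFact:
    subject: str
    relation: str
    obj: str
    start: date  # when the fact becomes valid
    end: date    # when the fact stops being valid

FACTS = [
    TemporalFact("Lionel Messi", "member of sports team", "FC Barcelona",
                 date(2004, 10, 16), date(2021, 8, 10)),
    TemporalFact("Lionel Messi", "member of sports team", "Paris Saint-Germain",
                 date(2021, 8, 10), date(2023, 6, 30)),
]

def gold_answer(subject: str, relation: str, when: date) -> str | None:
    """Gold answer for 'What was <subject>'s <relation> at <when>?':
    the object whose validity interval contains the query time."""
    for f in FACTS:
        if (f.subject == subject and f.relation == relation
                and f.start <= when < f.end):
            return f.obj
    return None

print(gold_answer("Lionel Messi", "member of sports team", date(2020, 5, 1)))
# -> FC Barcelona
print(gold_answer("Lionel Messi", "member of sports team", date(2022, 5, 1)))
# -> Paris Saint-Germain
```

Under this (assumed) formulation, a model's predicted answer to a time-scoped question can be scored against the interval-dependent gold answer, which is the kind of time-sensitive signal that a reward or span-extraction objective, as described in the abstract, would rely on.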
Anthology ID:
2023.acl-long.828
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
14820–14835
URL:
https://aclanthology.org/2023.acl-long.828
DOI:
10.18653/v1/2023.acl-long.828
Cite (ACL):
Qingyu Tan, Hwee Tou Ng, and Lidong Bing. 2023. Towards Benchmarking and Improving the Temporal Reasoning Capability of Large Language Models. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 14820–14835, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Towards Benchmarking and Improving the Temporal Reasoning Capability of Large Language Models (Tan et al., ACL 2023)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2023.acl-long.828.pdf
Video:
https://preview.aclanthology.org/improve-issue-templates/2023.acl-long.828.mp4