Transformer-Based Temporal Information Extraction and Application: A Review

Xin Su, Phillip Howard, Steven Bethard


Abstract
Temporal information extraction (IE) aims to extract structured temporal information from unstructured text, thereby uncovering the implicit timelines within it. This technique is applied across domains such as healthcare, newswire, and intelligence analysis, helping models in these areas perform temporal reasoning and enabling human users to grasp the temporal structure of text. Transformer-based pre-trained language models have produced revolutionary advancements in natural language processing, demonstrating exceptional performance across a multitude of tasks. Despite the achievements of Transformer-based approaches in temporal IE, there is a lack of comprehensive reviews of these efforts. In this paper, we aim to bridge this gap by systematically summarizing and analyzing the body of work on temporal IE using Transformers, while highlighting potential future research directions.
Anthology ID:
2025.emnlp-main.1467
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
28810–28829
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1467/
Cite (ACL):
Xin Su, Phillip Howard, and Steven Bethard. 2025. Transformer-Based Temporal Information Extraction and Application: A Review. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 28810–28829, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Transformer-Based Temporal Information Extraction and Application: A Review (Su et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1467.pdf
Checklist:
2025.emnlp-main.1467.checklist.pdf