@inproceedings{su-etal-2025-transformer,
    title = "Transformer-Based Temporal Information Extraction and Application: A Review",
    author = "Su, Xin  and
      Howard, Phillip  and
      Bethard, Steven",
    editor = "Christodoulopoulos, Christos  and
      Chakraborty, Tanmoy  and
      Rose, Carolyn  and
      Peng, Violet",
    booktitle = "Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing",
    month = nov,
    year = "2025",
    address = "Suzhou, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1467/",
    pages = "28810--28829",
    ISBN = "979-8-89176-332-6",
    abstract = "Temporal information extraction (IE) aims to extract structured temporal information from unstructured text, thereby uncovering the implicit timelines within. This technique is applied across domains such as healthcare, newswire, and intelligence analysis, aiding models in these areas to perform temporal reasoning and enabling human users to grasp the temporal structure of text. Transformer-based pre-trained language models have produced revolutionary advancements in natural language processing, demonstrating exceptional performance across a multitude of tasks. Despite the achievements garnered by Transformer-based approaches in temporal IE, there is a lack of comprehensive reviews on these endeavors. In this paper, we aim to bridge this gap by systematically summarizing and analyzing the body of work on temporal IE using Transformers while highlighting potential future research directions."
}