Discourse-Aware In-Context Learning for Temporal Expression Normalization

Akash Gautam, Lukas Lange, Jannik Strötgen


Abstract
Temporal expression (TE) normalization is a well-studied problem. However, the predominantly used rule-based systems are highly restricted to specific settings, and emerging machine learning approaches suffer from a lack of labeled data. In this work, we explore the feasibility of proprietary and open-source large language models (LLMs) for TE normalization using in-context learning to inject task, document, and example information into the model. We explore various sample selection strategies to retrieve the most relevant set of examples. By using a window-based prompt design approach, we can perform TE normalization across sentences while leveraging the LLM's knowledge without training the model. Our experiments show results competitive with models designed for this task. In particular, our method achieves large performance improvements in non-standard settings by dynamically including relevant examples during inference.
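As an illustration of the prompting strategy the abstract describes, below is a minimal Python sketch of a discourse-aware prompt builder: it combines a task instruction, the document creation time, a few retrieved examples, and a sentence window around the target expression. The token-overlap retriever, the example pool, and all names (`build_prompt`, `retrieve`, the TIMEX3-style target values) are hypothetical stand-ins for the paper's method, not the authors' released code.

```python
# Hypothetical sketch of discourse-aware in-context prompt construction for
# temporal expression (TE) normalization. All names, example data, and the
# overlap-based retriever are illustrative assumptions, not the paper's code.

from dataclasses import dataclass


@dataclass
class Example:
    sentence: str  # sentence containing the temporal expression
    dct: str       # document creation time, ISO 8601
    te: str        # surface form of the temporal expression
    value: str     # TIMEX3-style normalized value


# A toy pool of labeled examples for the retriever to draw from.
POOL = [
    Example("The meeting was moved to next Friday.", "2024-06-10", "next Friday", "2024-06-14"),
    Example("She arrived two days ago.", "2024-06-10", "two days ago", "2024-06-08"),
    Example("Sales rose last quarter.", "2024-06-10", "last quarter", "2024-Q1"),
]


def retrieve(query: str, pool: list[Example], k: int = 2) -> list[Example]:
    """Rank pool examples by token overlap with the query sentence
    (a simple stand-in for the paper's sample selection strategies)."""
    q = set(query.lower().split())
    scored = sorted(pool, key=lambda ex: -len(q & set(ex.sentence.lower().split())))
    return scored[:k]


def build_prompt(window: list[str], te: str, dct: str, pool: list[Example]) -> str:
    """Assemble task, document, and example information into one prompt.
    `window` holds the target sentence plus its neighbors, so cross-sentence
    discourse cues (e.g., an earlier absolute date) are visible to the LLM."""
    target = " ".join(window)
    lines = [
        "Task: normalize the marked temporal expression to a TIMEX3 value.",
        f"Document creation time: {dct}",
        "",
    ]
    for ex in retrieve(target, pool):
        lines += [f"Context: {ex.sentence}", f"DCT: {ex.dct}",
                  f"Expression: {ex.te}", f"Value: {ex.value}", ""]
    lines += [f"Context: {target}", f"DCT: {dct}",
              f"Expression: {te}", "Value:"]
    return "\n".join(lines)


if __name__ == "__main__":
    window = ["The report was filed on 2024-03-01.", "A follow-up is due next week."]
    print(build_prompt(window, te="next week", dct="2024-03-02", pool=POOL))
```

Selecting examples at inference time, rather than fixing a static prompt, is what would let such a setup adapt to the non-standard settings the abstract highlights.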
Anthology ID:
2024.naacl-short.27
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
306–315
URL:
https://aclanthology.org/2024.naacl-short.27
Cite (ACL):
Akash Gautam, Lukas Lange, and Jannik Strötgen. 2024. Discourse-Aware In-Context Learning for Temporal Expression Normalization. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers), pages 306–315, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Discourse-Aware In-Context Learning for Temporal Expression Normalization (Gautam et al., NAACL 2024)
PDF:
https://preview.aclanthology.org/bionlp-24-ingestion/2024.naacl-short.27.pdf