LAILab at Chemotimelines 2024: Finetuning sequence-to-sequence language models for temporal relation extraction towards cancer patient undergoing chemotherapy treatment

Shohreh Haddadan, Tuan-Dung Le, Thanh Duong, Thanh Thieu


Abstract
In this paper, we report our effort to tackle the challenge of extracting chemotimelines from EHR notes across a dataset of three cancer types. We focus on two subtasks: (1) detecting and classifying temporal relations given annotated chemotherapy events and time expressions, and (2) directly extracting patient chemotherapy timelines from EHR notes. We address both subtasks using large language models. Our best-performing method for each subtask uses Flan-T5, an instruction-tuned language model, and our proposed system achieves the highest average score on both subtasks. These results underscore the effectiveness of finetuning general-domain large language models for domain-specific, unseen tasks.
Anthology ID:
2024.clinicalnlp-1.37
Volume:
Proceedings of the 6th Clinical Natural Language Processing Workshop
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Tristan Naumann, Asma Ben Abacha, Steven Bethard, Kirk Roberts, Danielle Bitterman
Venues:
ClinicalNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
382–393
URL:
https://aclanthology.org/2024.clinicalnlp-1.37
DOI:
10.18653/v1/2024.clinicalnlp-1.37
Cite (ACL):
Shohreh Haddadan, Tuan-Dung Le, Thanh Duong, and Thanh Thieu. 2024. LAILab at Chemotimelines 2024: Finetuning sequence-to-sequence language models for temporal relation extraction towards cancer patient undergoing chemotherapy treatment. In Proceedings of the 6th Clinical Natural Language Processing Workshop, pages 382–393, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
LAILab at Chemotimelines 2024: Finetuning sequence-to-sequence language models for temporal relation extraction towards cancer patient undergoing chemotherapy treatment (Haddadan et al., ClinicalNLP-WS 2024)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2024.clinicalnlp-1.37.pdf