Fine-tuning BERT Models for Summarizing German Radiology Findings

Siting Liang, Klaus Kades, Matthias Fink, Peter Full, Tim Weber, Jens Kleesiek, Michael Strube, Klaus Maier-Hein


Abstract
Writing the conclusion section of radiology reports is essential for communicating radiological findings and their assessment to physicians in a condensed form. In this work, we employ a transformer-based Seq2Seq model for generating the conclusion section of German radiology reports. The model is initialized with the pretrained parameters of a German BERT model and fine-tuned on our domain data in the downstream task. We propose two strategies to improve the factual correctness of the model. In the first, alongside the abstractive learning objective, we introduce an extractive learning objective that trains the decoder both to generate a summary sequence and to extract the key findings from the source input. The second approach integrates a pointer mechanism into the transformer-based Seq2Seq model. The pointer network helps the Seq2Seq model choose between generating tokens from the vocabulary and copying parts of the source input during generation. The results of the automatic and human evaluations show that the enhanced Seq2Seq model is capable of generating human-like radiology conclusions and that the improved models effectively reduce factual errors in the generated text despite the small amount of training data.
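The pointer mechanism mentioned in the abstract can be illustrated with a minimal NumPy sketch of a single decoding step. This is not the authors' implementation; the function name, the scalar generation gate `p_gen`, and the toy dimensions are assumptions used purely for illustration of the generate-vs-copy mixture.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def pointer_generator_step(vocab_logits, attn_scores, source_ids, p_gen):
    """One decoding step of a pointer-generator mixture (illustrative sketch).

    vocab_logits : unnormalized scores over the output vocabulary
    attn_scores  : unnormalized attention scores over the source positions
    source_ids   : vocabulary id of the token at each source position
    p_gen        : scalar in [0, 1]; probability of generating from the
                   vocabulary rather than copying from the source

    Returns the final distribution:
        P(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum of attention
               mass on source positions where token w occurs.
    """
    p_vocab = softmax(vocab_logits)      # generation distribution
    attn = softmax(attn_scores)          # copy (attention) distribution
    final = p_gen * p_vocab
    for pos, tok in enumerate(source_ids):
        final[tok] += (1.0 - p_gen) * attn[pos]
    return final

# Toy example: vocabulary of 10 tokens, a two-token source containing
# tokens 3 and 7; uniform vocab logits so copying visibly boosts them.
dist = pointer_generator_step(
    vocab_logits=np.zeros(10),
    attn_scores=np.array([2.0, 0.0]),
    source_ids=[3, 7],
    p_gen=0.5,
)
```

With `p_gen = 0.5`, half of the probability mass follows the vocabulary distribution and half is redistributed onto the source tokens according to attention, so source tokens such as id 3 receive more mass than unrelated vocabulary entries.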
Anthology ID:
2022.clinicalnlp-1.4
Volume:
Proceedings of the 4th Clinical Natural Language Processing Workshop
Month:
July
Year:
2022
Address:
Seattle, WA
Editors:
Tristan Naumann, Steven Bethard, Kirk Roberts, Anna Rumshisky
Venue:
ClinicalNLP
Publisher:
Association for Computational Linguistics
Pages:
30–40
URL:
https://aclanthology.org/2022.clinicalnlp-1.4
DOI:
10.18653/v1/2022.clinicalnlp-1.4
Cite (ACL):
Siting Liang, Klaus Kades, Matthias Fink, Peter Full, Tim Weber, Jens Kleesiek, Michael Strube, and Klaus Maier-Hein. 2022. Fine-tuning BERT Models for Summarizing German Radiology Findings. In Proceedings of the 4th Clinical Natural Language Processing Workshop, pages 30–40, Seattle, WA. Association for Computational Linguistics.
Cite (Informal):
Fine-tuning BERT Models for Summarizing German Radiology Findings (Liang et al., ClinicalNLP 2022)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2022.clinicalnlp-1.4.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2022.clinicalnlp-1.4.mp4