Speaker Attribution in German Parliamentary Debates with QLoRA-adapted Large Language Models

Tobias Bornheim, Niklas Grieger, Patrick Gustav Blaneck, Stephan Bialonski


Abstract
The growing body of political texts opens up new opportunities for rich insights into political dynamics and ideologies but also increases the workload for manual analysis. Automated speaker attribution, which detects who said what to whom in a speech event and is closely related to semantic role labeling, is an important processing step for computational text analysis. We study the potential of the large language model family Llama 2 to automate speaker attribution in German parliamentary debates from 2017 to 2021. We fine-tune Llama 2 with QLoRA, an efficient training strategy, and find that our approach achieves competitive performance in the GermEval 2023 Shared Task on Speaker Attribution in German News Articles and Parliamentary Debates. Our results shed light on the capabilities of large language models in automating speaker attribution, revealing a promising avenue for computational analysis of political discourse and the development of semantic role labeling systems.
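The fine-tuning strategy named in the abstract, QLoRA, quantizes the frozen base weights and trains only small low-rank adapter matrices on top of them. As a rough illustration (not the paper's actual setup — all dimensions, values, and the `lora_forward` helper are invented for this sketch), the core low-rank update can be written as:

```python
import numpy as np

# Sketch of the low-rank update that LoRA (and QLoRA) learns on top of a
# frozen weight matrix: only A and B are trained, W stays fixed (and, in
# QLoRA, is stored 4-bit quantized). Values here are purely illustrative.
rng = np.random.default_rng(0)

d_out, d_in, r = 8, 8, 2              # adapter rank r << min(d_out, d_in)
alpha = 4.0                           # LoRA scaling hyperparameter

W = rng.normal(size=(d_out, d_in))    # frozen base weight
A = rng.normal(size=(r, d_in)) * 0.01 # trainable down-projection
B = np.zeros((d_out, r))              # trainable up-projection, zero-initialized

def lora_forward(x):
    """Forward pass with the LoRA delta: (W + (alpha / r) * B @ A) @ x."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=(d_in,))
# With B initialized to zero, the adapted layer matches the frozen base layer.
assert np.allclose(lora_forward(x), W @ x)
```

Because only `A` and `B` are updated, the number of trainable parameters per layer is `r * (d_in + d_out)` rather than `d_in * d_out`, which is what makes fine-tuning a 7B–70B parameter model like Llama 2 feasible on modest hardware.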
Anthology ID:
2024.jlcl-1.1
Volume:
Journal for Language Technology and Computational Linguistics, Vol. 37 No. 1
Month:
March
Year:
2024
Address:
Germany
Editor:
Christian Wartena
Venue:
JLCL
Publisher:
German Society for Computational Linguistics and Language Technology
Pages:
1–13
URL:
https://preview.aclanthology.org/ingestion-wsc-csdh-2025/2024.jlcl-1.1/
DOI:
10.21248/jlcl.37.2024.244
Cite (ACL):
Tobias Bornheim, Niklas Grieger, Patrick Gustav Blaneck, and Stephan Bialonski. 2024. Speaker Attribution in German Parliamentary Debates with QLoRA-adapted Large Language Models. Journal for Language Technology and Computational Linguistics, 37(1):1–13.
Cite (Informal):
Speaker Attribution in German Parliamentary Debates with QLoRA-adapted Large Language Models (Bornheim et al., JLCL 2024)
PDF:
https://preview.aclanthology.org/ingestion-wsc-csdh-2025/2024.jlcl-1.1.pdf