TriLLaMa at CQs-Gen 2025: A Two-Stage LLM-Based System for Critical Question Generation

Frieso Turkstra, Sara Nabhani, Khalid Al-Khatib


Abstract
This paper presents a new system for generating critical questions in debates, developed for the Critical Questions Generation shared task. Our two-stage approach, combining generation and classification, uses LLaMA 3.1 Instruct models (8B, 70B, 405B) with zero-/few-shot prompting. Evaluations on annotated debate data reveal several key insights: few-shot generation with the 405B model yielded relatively high-quality questions, achieving a maximum-possible punctuation score of 73.5. The 70B model outperformed both smaller and larger variants on the classification task. The classifiers showed a strong bias toward labeling generated questions as Useful, despite limited validation. Further, our system ranked 6th, outperforming the baselines by 3%. These findings underscore the effectiveness of large models for question generation and medium-sized models for classification, and suggest the need for clearer task definitions within prompts to improve classification accuracy.
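The two-stage design described in the abstract (generate candidate critical questions, then classify and keep the useful ones) can be pictured with a minimal sketch. The following Python is a hypothetical reconstruction, not the authors' released code: `call_llm` stands in for an actual LLaMA 3.1 Instruct endpoint, and the prompt wording, the few-shot example, and the Useful/Unhelpful/Invalid label set are assumptions (only the Useful label is named in the abstract).

```python
from typing import Callable, List

# Hypothetical few-shot example; the real system's exemplars are not shown here.
FEW_SHOT_EXAMPLES = [
    ("We must ban cars because they pollute.",
     "Are there alternative ways to reduce pollution besides banning cars?"),
]

def build_generation_prompt(intervention: str) -> str:
    """Stage 1 prompt: show (argument, critical question) pairs,
    then ask for a question targeting the new intervention."""
    parts = ["Generate a critical question that challenges the argument.\n"]
    for arg, cq in FEW_SHOT_EXAMPLES:
        parts.append(f"Argument: {arg}\nCritical question: {cq}\n")
    parts.append(f"Argument: {intervention}\nCritical question:")
    return "\n".join(parts)

def build_classification_prompt(intervention: str, question: str) -> str:
    """Stage 2 prompt: ask the model to judge a generated question."""
    return (
        "Label the critical question as Useful, Unhelpful, or Invalid "
        "with respect to the argument.\n"
        f"Argument: {intervention}\nQuestion: {question}\nLabel:"
    )

def generate_and_filter(
    intervention: str,
    call_llm: Callable[[str], str],
    n_candidates: int = 5,
) -> List[str]:
    """Sample candidate questions, then keep those labeled Useful."""
    candidates = [call_llm(build_generation_prompt(intervention))
                  for _ in range(n_candidates)]
    kept = []
    for question in candidates:
        label = call_llm(build_classification_prompt(intervention, question))
        if label.strip().startswith("Useful"):
            kept.append(question)
    return kept

if __name__ == "__main__":
    # Dummy model so the sketch runs end to end without a real LLM.
    dummy = lambda prompt: ("Useful" if "Label:" in prompt
                            else "What evidence supports this claim?")
    print(generate_and_filter("Taxes should rise to fund schools.", dummy))
```

Under this reading, the abstract's finding maps onto the two callables: a large (405B) model serves the generation prompt, while a medium (70B) model serves the classification prompt.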
Anthology ID:
2025.argmining-1.34
Volume:
Proceedings of the 12th Argument Mining Workshop
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Elena Chistova, Philipp Cimiano, Shohreh Haddadan, Gabriella Lapesa, Ramon Ruiz-Dolz
Venues:
ArgMining | WS
Publisher:
Association for Computational Linguistics
Pages:
349–357
URL:
https://preview.aclanthology.org/mtsummit-25-ingestion/2025.argmining-1.34/
DOI:
10.18653/v1/2025.argmining-1.34
Cite (ACL):
Frieso Turkstra, Sara Nabhani, and Khalid Al-Khatib. 2025. TriLLaMa at CQs-Gen 2025: A Two-Stage LLM-Based System for Critical Question Generation. In Proceedings of the 12th Argument Mining Workshop, pages 349–357, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
TriLLaMa at CQs-Gen 2025: A Two-Stage LLM-Based System for Critical Question Generation (Turkstra et al., ArgMining 2025)
PDF:
https://preview.aclanthology.org/mtsummit-25-ingestion/2025.argmining-1.34.pdf