Webis at CQs-Gen 2025: Prompting and Reranking for Critical Questions

Midhun Kanadan, Johannes Kiesel, Maximilian Heinrich, Benno Stein


Abstract
This paper reports on the submission of team Webis to the Critical Question Generation shared task at the 12th Workshop on Argument Mining (ArgMining 2025). Our approach is a fully automated two-stage pipeline that first prompts a large language model (LLM) to generate candidate critical questions for a given argumentative intervention, and then reranks the generated questions according to a classifier's confidence in their usefulness. For the generation stage, we tested zero-shot, few-shot, and chain-of-thought prompting strategies. For the reranking stage, we used a ModernBERT classifier that we fine-tuned on either the validation set or an augmented version of it. Among our submissions, the best-performing configuration achieved a test score of 0.57 and ranked 5th in the shared task. Submissions that used reranking consistently outperformed baseline submissions without reranking across all metrics. Our results demonstrate that combining open-weight LLMs with reranking significantly improves the quality of the resulting critical questions.
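The sketch below illustrates the two-stage pipeline described in the abstract: generate candidate questions with an LLM, then rerank them by a classifier's confidence in their usefulness. It is a minimal illustration, not the authors' released code; the model checkpoints, prompt template, decoding settings, and the label convention of the usefulness classifier are all assumptions.

```python
# Hedged sketch of the generate-then-rerank pipeline. Model names, the prompt,
# and the "useful = label 1" convention are illustrative assumptions.
import torch
from transformers import (
    pipeline,
    AutoTokenizer,
    AutoModelForSequenceClassification,
)

# Stage 1: prompt an open-weight LLM for candidate critical questions
# (zero-shot here; few-shot or chain-of-thought prompts would replace the template).
generator = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")

def generate_candidates(intervention: str, n: int = 10) -> list[str]:
    prompt = (
        "Read the following argumentative intervention and write one critical "
        f"question that challenges its reasoning.\n\n{intervention}\n\nQuestion:"
    )
    outputs = generator(
        prompt,
        num_return_sequences=n,
        do_sample=True,
        max_new_tokens=40,
        return_full_text=False,
    )
    return [o["generated_text"].strip() for o in outputs]

# Stage 2: rerank candidates with a ModernBERT sequence classifier. The base
# checkpoint is shown; the paper fine-tunes it on the validation set or an
# augmented version of it, so fine-tuned weights would be loaded here instead.
clf_name = "answerdotai/ModernBERT-base"
tokenizer = AutoTokenizer.from_pretrained(clf_name)
classifier = AutoModelForSequenceClassification.from_pretrained(clf_name, num_labels=2)

def rerank(intervention: str, questions: list[str]) -> list[str]:
    # Encode each (intervention, question) pair jointly.
    inputs = tokenizer(
        [intervention] * len(questions),
        questions,
        padding=True,
        truncation=True,
        return_tensors="pt",
    )
    with torch.no_grad():
        logits = classifier(**inputs).logits
    # Confidence that a question is useful (assumed to be class index 1).
    scores = logits.softmax(dim=-1)[:, 1]
    ranked = sorted(zip(questions, scores.tolist()), key=lambda x: -x[1])
    return [q for q, _ in ranked]
```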
Anthology ID:
2025.argmining-1.26
Volume:
Proceedings of the 12th Argument Mining Workshop
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Elena Chistova, Philipp Cimiano, Shohreh Haddadan, Gabriella Lapesa, Ramon Ruiz-Dolz
Venues:
ArgMining | WS
Publisher:
Association for Computational Linguistics
Pages:
281–288
URL:
https://preview.aclanthology.org/landing_page/2025.argmining-1.26/
DOI:
10.18653/v1/2025.argmining-1.26
Cite (ACL):
Midhun Kanadan, Johannes Kiesel, Maximilian Heinrich, and Benno Stein. 2025. Webis at CQs-Gen 2025: Prompting and Reranking for Critical Questions. In Proceedings of the 12th Argument Mining Workshop, pages 281–288, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Webis at CQs-Gen 2025: Prompting and Reranking for Critical Questions (Kanadan et al., ArgMining 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.argmining-1.26.pdf