2025
Adaptive Prompting: Ad-hoc Prompt Composition for Social Bias Detection
Maximilian Spliethöver | Tim Knebler | Fabian Fumagalli | Maximilian Muschalik | Barbara Hammer | Eyke Hüllermeier | Henning Wachsmuth
Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Recent advances in instruction fine-tuning have led to the development of various prompting techniques for large language models, such as explicit reasoning steps. However, the success of these techniques depends on several parameters, such as the task, the language model, and the context provided. Finding an effective prompt is therefore often a trial-and-error process. Most existing approaches to automatic prompting aim to optimize individual techniques rather than compositions of techniques and their dependence on the input. To fill this gap, we propose an adaptive prompting approach that predicts the optimal prompt composition ad-hoc for a given input. We apply our approach to social bias detection, a highly context-dependent task that requires semantic understanding. We evaluate it with three large language models on three datasets, comparing compositions to individual techniques and other baselines. The results underline the importance of finding an effective prompt composition. Our approach robustly ensures high detection performance and is best in several settings. Moreover, first experiments on other tasks support its generalizability.
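The core idea of the abstract, choosing a composition of prompting techniques per input rather than a fixed prompt, can be sketched roughly as below. This is a minimal illustration under stated assumptions, not the paper's implementation: the component texts, the feature extractor, and the meta-classifier are all hypothetical choices.

```python
# Minimal sketch of adaptive prompt composition (not the paper's method).
# Assumption: a small meta-classifier, trained on a labeled development set,
# predicts which combination of prompting techniques to use for a given input.

from itertools import product
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical prompt components that can be switched on or off per input.
COMPONENTS = {
    "definition": "Social bias is a prejudice toward a social group.\n",
    "few_shot": "Example: '...' -> biased\nExample: '...' -> not biased\n",
    "reasoning": "Think step by step before giving your answer.\n",
}

# Every on/off combination of the components forms one candidate composition.
COMPOSITIONS = [
    tuple(name for name, on in zip(COMPONENTS, flags) if on)
    for flags in product([0, 1], repeat=len(COMPONENTS))
]

def build_prompt(text: str, composition: tuple[str, ...]) -> str:
    """Assemble the prompt from the selected components plus the task instruction."""
    parts = [COMPONENTS[name] for name in composition]
    parts.append(f"Is the following text socially biased? Answer yes or no.\nText: {text}")
    return "".join(parts)

def fit_selector(train_texts: list[str], best_composition_ids: list[int]):
    """Train a meta-classifier mapping an input to the index of its best composition.

    The labels would come from evaluating all compositions on a development set
    (assumed to exist here).
    """
    vectorizer = TfidfVectorizer()
    features = vectorizer.fit_transform(train_texts)
    selector = LogisticRegression(max_iter=1000).fit(features, best_composition_ids)
    return vectorizer, selector

def adaptive_prompt(text: str, vectorizer, selector) -> str:
    """Predict the composition ad-hoc for this input and build the prompt."""
    composition_id = int(selector.predict(vectorizer.transform([text]))[0])
    return build_prompt(text, COMPOSITIONS[composition_id])
```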
2017
Annotation Challenges for Reconstructing the Structural Elaboration of Middle Low German
Nina Seemann | Marie-Luis Merten | Michaela Geierhos | Doris Tophinke | Eyke Hüllermeier
Proceedings of the Joint SIGHUM Workshop on Computational Linguistics for Cultural Heritage, Social Sciences, Humanities and Literature
In this paper, we present the annotation challenges we have encountered when working on a historical language that was undergoing elaboration processes. We especially focus on syntactic ambiguity and gradience in Middle Low German, which cause a certain degree of uncertainty. Since current annotation tools only partially consider construction contexts and the dynamics of grammaticalization, we plan to extend CorA, a web-based annotation tool for historical and other non-standard language data, to capture elaboration phenomena and annotator uncertainty. Moreover, we seek to interactively learn morphological as well as syntactic annotations.
2013
Learning to Rank Lexical Substitutions
György Szarvas | Róbert Busa-Fekete | Eyke Hüllermeier
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing