@inproceedings{sieker-zarriess-2023-language,
    title = "When Your Language Model Cannot {E}ven Do Determiners Right: Probing for Anti-Presuppositions and the Maximize Presupposition! Principle",
    author = "Sieker, Judith  and
      Zarrie{\ss}, Sina",
    editor = "Belinkov, Yonatan  and
      Hao, Sophie  and
      Jumelet, Jaap  and
      Kim, Najoung  and
      McCarthy, Arya  and
      Mohebbi, Hosein",
    booktitle = "Proceedings of the 6th BlackboxNLP Workshop: Analyzing and Interpreting Neural Networks for NLP",
    month = dec,
    year = "2023",
    address = "Singapore",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2023.blackboxnlp-1.14/",
    doi = "10.18653/v1/2023.blackboxnlp-1.14",
    pages = "180--198",
    abstract = "The increasing interest in probing the linguistic capabilities of large language models (LLMs) has long reached the area of semantics and pragmatics, including the phenomenon of presuppositions. In this study, we investigate a phenomenon that, however, has not yet been investigated, i.e., the phenomenon of anti-presupposition and the principle that accounts for it, the Maximize Presupposition! principle (MP!). Through an experimental investigation using psycholinguistic data and four open-source BERT model variants, we explore how language models handle different anti-presuppositions and whether they apply the MP! principle in their predictions. Further, we examine whether fine-tuning with Natural Language Inference data impacts adherence to the MP! principle. Our findings reveal that LLMs tend to replicate context-based n-grams rather than follow the MP! principle, with fine-tuning not enhancing their adherence. Notably, our results further indicate a striking difficulty of LLMs to correctly predict determiners, in relatively simple linguistic contexts."
}