@inproceedings{estienne-2023-unsupervised,
    title = "Unsupervised Calibration through Prior Adaptation for Text Classification using Large Language Models",
    author = "Estienne, Lautaro",
    editor = "Hardalov, Momchil  and
      Kancheva, Zara  and
      Velichkov, Boris  and
      Nikolova-Koleva, Ivelina  and
      Slavcheva, Milena",
    booktitle = "Proceedings of the 8th Student Research Workshop associated with the International Conference Recent Advances in Natural Language Processing",
    month = sep,
    year = "2023",
    address = "Varna, Bulgaria",
    publisher = "INCOMA Ltd., Shoumen, Bulgaria",
    url = "https://preview.aclanthology.org/ingest-emnlp/2023.ranlp-stud.2/",
    pages = "13--22",
    abstract = "A wide variety of natural language tasks are currently being addressed with large-scale language models (LLMs). These models are usually trained with a very large amount of unsupervised text data and adapted to perform a downstream natural language task using methods like fine-tuning, calibration or in-context learning. In this work, we propose an approach to adapt the prior class distribution to perform text classification tasks without the need for labelled samples and only a few in-domain sample queries. The proposed approach treats the LLM as a black box, adding a stage where the model posteriors are calibrated to the task. Results show that these methods outperform the un-adapted model for different number of training shots in the prompt and a previous approach where calibration is performed without using any adaptation data."
}
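The abstract describes re-weighting black-box LLM posteriors by an adapted class prior estimated from unlabelled in-domain queries. The sketch below is an illustrative EM-style prior re-estimation (in the spirit of Saerens et al., 2002), not the paper's exact algorithm; the function name `adapt_priors`, the uniform assumed prior, and the update rule are assumptions for illustration only.

```python
import numpy as np

def adapt_priors(log_posteriors, num_iters=10):
    """Illustrative sketch: adapt the class prior from unlabelled queries
    and re-calibrate black-box LLM posteriors (not the paper's exact method).

    log_posteriors: (N, K) array of log P(class | query) scored by the LLM
                    for N in-domain queries and K candidate classes.
    """
    # Normalise to proper posteriors (stable softmax over the class axis).
    posteriors = np.exp(log_posteriors - log_posteriors.max(axis=1, keepdims=True))
    posteriors /= posteriors.sum(axis=1, keepdims=True)

    K = posteriors.shape[1]
    prior_old = np.full(K, 1.0 / K)   # assumption: model's implicit prior is uniform
    prior_new = prior_old.copy()

    for _ in range(num_iters):
        # E-step: re-weight posteriors by the new-to-old prior ratio, renormalise.
        adapted = posteriors * (prior_new / prior_old)
        adapted /= adapted.sum(axis=1, keepdims=True)
        # M-step: update the prior as the mean adapted posterior over the queries.
        prior_new = adapted.mean(axis=0)

    # Final calibrated posteriors under the adapted prior.
    adapted = posteriors * (prior_new / prior_old)
    adapted /= adapted.sum(axis=1, keepdims=True)
    return adapted, prior_new
```

This requires no labels: only the model's scores on a handful of in-domain queries are used to estimate the prior, consistent with the black-box, unsupervised setting the abstract describes.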