@inproceedings{thulke-etal-2025-listen,
    title = "Listen to the Context: Towards Faithful Large Language Models for Retrieval Augmented Generation on Climate Questions",
    author = "Thulke, David  and
      Kemmler, Jakob  and
      Dugast, Christian  and
      Ney, Hermann",
    editor = "Dutia, Kalyan  and
      Henderson, Peter  and
      Leippold, Markus  and
      Manning, Christopher  and
      Morio, Gaku  and
      Muccione, Veruska  and
      Ni, Jingwei  and
      Schimanski, Tobias  and
      Stammbach, Dominik  and
      Singh, Alok  and
      Su, Alba (Ruiran)  and
      A. Vaghefi, Saeid",
    booktitle = "Proceedings of the 2nd Workshop on Natural Language Processing Meets Climate Change (ClimateNLP 2025)",
    month = jul,
    year = "2025",
    address = "Vienna, Austria",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.climatenlp-1.17/",
    doi = "10.18653/v1/2025.climatenlp-1.17",
    pages = "245--259",
    ISBN = "979-8-89176-259-6",
    abstract = "Large language models that use retrieval augmented generation have the potential to unlock valuable knowledge for researchers, policymakers, and the public by making long and technical climate-related documents more accessible. While this approach can help alleviate factual hallucinations by relying on retrieved passages as additional context, its effectiveness depends on whether the model{'}s output remains faithful to these passages. To address this, we explore the automatic assessment of faithfulness of different models in this setting. We then focus on ClimateGPT, a large language model specialised in climate science, to examine which factors in its instruction fine-tuning impact the model{'}s faithfulness. By excluding unfaithful subsets of the model{'}s training data, we develop ClimateGPT Faithful+, which achieves an improvement in faithfulness from 30{\%} to 57{\%} in supported atomic claims according to our automatic metric."
}

Markdown (Informal)
[Listen to the Context: Towards Faithful Large Language Models for Retrieval Augmented Generation on Climate Questions](https://preview.aclanthology.org/ingest-emnlp/2025.climatenlp-1.17/) (Thulke et al., ClimateNLP 2025)