Are LLMs Aware that Some Questions are not Open-ended?

Dongjie Yang, Hai Zhao


Abstract
Large Language Models (LLMs) have shown impressive capabilities in answering questions across a wide range of scenarios. However, when LLMs face different types of questions, it is worth exploring whether they are aware that some questions have limited answers and call for more deterministic responses, while others do not. We refer to this as the question awareness of LLMs. A lack of question awareness leads to two failure modes: LLMs are (1) too casual when answering non-open-ended questions or (2) too bland when answering open-ended ones. In this paper, we first evaluate question awareness in LLMs. The experimental results show that LLMs lack awareness of questions in certain domains, e.g., factual knowledge, resulting in hallucinations during generation. To mitigate this, we propose Question Awareness Temperature Sampling (QuATS), a method that enhances the question awareness of LLMs by adaptively adjusting their output distributions based on question features. This automatic adjustment eliminates the need for manual temperature tuning in text generation and consistently improves model performance across various benchmarks.
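The core idea described in the abstract — adjusting the sampling temperature per question rather than fixing it manually — can be illustrated with a minimal sketch. The mapping below from a question-"openness" score to a temperature, and the score itself, are hypothetical placeholders; the paper derives the adjustment from learned question features, which the abstract does not specify.

```python
import math
import random

def openness_to_temperature(openness, t_min=0.3, t_max=1.2):
    """Map a question-openness score in [0, 1] to a sampling temperature.

    Hypothetical linear mapping: non-open-ended questions (openness near 0)
    get a low temperature (more deterministic output); open-ended questions
    (openness near 1) get a higher temperature (more diverse output).
    """
    openness = min(max(openness, 0.0), 1.0)
    return t_min + openness * (t_max - t_min)

def sample_token(logits, openness, seed=None):
    """Sample a token index from logits at a question-dependent temperature."""
    t = openness_to_temperature(openness)
    scaled = [x / t for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    z = sum(exps)
    probs = [e / z for e in exps]
    # Inverse-CDF sampling from the temperature-scaled distribution.
    r = random.Random(seed).random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r <= acc:
            return i
    return len(probs) - 1
```

For a factual question one would pass a low openness score, sharpening the distribution toward the top token; for a creative prompt, a high score flattens it. This is a sketch of temperature-scaled sampling under the stated assumptions, not the paper's actual QuATS implementation.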
Anthology ID:
2024.findings-emnlp.117
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2142–2152
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-emnlp.117/
DOI:
10.18653/v1/2024.findings-emnlp.117
Cite (ACL):
Dongjie Yang and Hai Zhao. 2024. Are LLMs Aware that Some Questions are not Open-ended?. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 2142–2152, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Are LLMs Aware that Some Questions are not Open-ended? (Yang & Zhao, Findings 2024)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-emnlp.117.pdf