Understand User Opinions of Large Language Models via LLM-Powered In-the-Moment User Experience Interviews
Mengqiao Liu, Tevin Wang, Cassandra A. Cohen, Sarah Li, Chenyan Xiong
Abstract
Which large language model (LLM) is better? Every evaluation tells a story, but what do users really think about current LLMs? This paper presents CLUE, an LLM-powered interviewer that conducts in-the-moment user experience interviews, right after users interact with LLMs, and automatically gathers insights about user opinions from massive interview logs. We conduct a study with thousands of users to understand user opinions on mainstream LLMs, recruiting users to first chat with a target LLM and then be interviewed by CLUE. Our experiments demonstrate that CLUE captures interesting user opinions, e.g., the bipolar views on the displayed reasoning process of DeepSeek-R1 and demands for information freshness and multi-modality. Our code and data are at https://github.com/cxcscmu/LLM-Interviewer.
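The abstract describes CLUE only at a high level: an LLM interviewer asks follow-up questions immediately after a chat session and records the answers as an interview log. The snippet below is a minimal, hypothetical sketch of that idea, not the released implementation; the system prompt, question budget, model name, and use of the OpenAI chat client are illustrative assumptions. See the repository linked above for the authors' actual code and data.

```python
# Hypothetical sketch of an in-the-moment interview loop driven by an LLM.
# Prompt wording, question budget, and the OpenAI chat API usage are
# assumptions for illustration; they are not taken from the CLUE paper.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

INTERVIEWER_SYSTEM_PROMPT = (
    "You are a user-experience interviewer. The user has just finished a chat "
    "session with a target LLM. Ask one short, open-ended follow-up question "
    "at a time about their experience (helpfulness, trust, frustrations, "
    "missing capabilities). Do not evaluate the target LLM yourself."
)

def interview(chat_transcript: str, num_questions: int = 3) -> list[dict]:
    """Run a short post-session interview and return the interview log."""
    messages = [
        {"role": "system", "content": INTERVIEWER_SYSTEM_PROMPT},
        {"role": "user", "content": f"Transcript of my session:\n{chat_transcript}"},
    ]
    log = []
    for _ in range(num_questions):
        # Ask the next interview question, conditioned on the transcript
        # and the answers collected so far.
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=messages,
        )
        question = response.choices[0].message.content
        answer = input(f"\nInterviewer: {question}\nYou: ")
        messages.append({"role": "assistant", "content": question})
        messages.append({"role": "user", "content": answer})
        log.append({"question": question, "answer": answer})
    return log
```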
- Anthology ID:
- 2025.findings-acl.714
- Volume:
- Findings of the Association for Computational Linguistics: ACL 2025
- Month:
- July
- Year:
- 2025
- Address:
- Vienna, Austria
- Editors:
- Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 13872–13893
- URL:
- https://preview.aclanthology.org/transition-to-people-yaml/2025.findings-acl.714/
- DOI:
- 10.18653/v1/2025.findings-acl.714
- Cite (ACL):
- Mengqiao Liu, Tevin Wang, Cassandra A. Cohen, Sarah Li, and Chenyan Xiong. 2025. Understand User Opinions of Large Language Models via LLM-Powered In-the-Moment User Experience Interviews. In Findings of the Association for Computational Linguistics: ACL 2025, pages 13872–13893, Vienna, Austria. Association for Computational Linguistics.
- Cite (Informal):
- Understand User Opinions of Large Language Models via LLM-Powered In-the-Moment User Experience Interviews (Liu et al., Findings 2025)
- PDF:
- https://preview.aclanthology.org/transition-to-people-yaml/2025.findings-acl.714.pdf