Game Development as Human-LLM Interaction

Jiale Hong, Hongqiu Wu, Hai Zhao


Abstract
Game development is a highly specialized task that relies on a complex game engine driven by sophisticated programming languages, preventing many gaming enthusiasts from handling it. This paper introduces the Chat Game Engine (ChatGE), powered by an LLM, which allows everyone to develop a custom game in natural language through Human-LLM interaction. To enable an LLM to function as a ChatGE, we instruct it to perform the following processes in each turn: (1) P_script: configure the game script segment based on the user's input; (2) P_code: generate the corresponding code snippet based on the game script segment; (3) P_utter: interact with the user, providing guidance and feedback. We propose an LLM-based data synthesis pipeline to generate game script-code pairs and interactions from a small set of manually crafted seed data, and a three-stage training strategy following curriculum-learning principles to smoothly transfer a dialogue-based LLM into our ChatGE. We construct a ChatGE for poker games as a case study and comprehensively evaluate it from two perspectives: interaction quality and code correctness.
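To make the per-turn process concrete, the sketch below illustrates how the three steps (P_script, P_code, P_utter) could be chained in a single conversational turn. This is not the authors' implementation; the function `call_llm`, the `GameState` container, and the prompt wording are all hypothetical placeholders standing in for whatever LLM backend and game-script format ChatGE actually uses.

```python
# Minimal illustrative sketch of one ChatGE turn (hypothetical, not the paper's code).
from dataclasses import dataclass, field


def call_llm(prompt: str) -> str:
    """Placeholder for a chat-completion call; swap in a real LLM client here."""
    return f"<LLM output for: {prompt[:40]}...>"


@dataclass
class GameState:
    script: list[str] = field(default_factory=list)  # accumulated game-script segments
    code: list[str] = field(default_factory=list)    # accumulated code snippets


def chatge_turn(state: GameState, user_input: str) -> str:
    # P_script: configure the next game-script segment from the user's request.
    segment = call_llm(
        "Existing script:\n" + "\n".join(state.script) +
        f"\nUser request: {user_input}\nWrite the next game-script segment."
    )
    state.script.append(segment)

    # P_code: generate the code snippet corresponding to this script segment.
    snippet = call_llm(
        f"Script segment:\n{segment}\nGenerate the corresponding code snippet."
    )
    state.code.append(snippet)

    # P_utter: reply to the user with guidance and feedback on what was built.
    return call_llm(
        f"Segment:\n{segment}\nCode:\n{snippet}\n"
        "Summarize what was added and ask the user for the next feature."
    )


if __name__ == "__main__":
    state = GameState()
    print(chatge_turn(state, "Start a two-player Texas Hold'em game."))
```

In this reading, the game script and generated code grow incrementally across turns while P_utter keeps the user in the loop; the paper's data synthesis pipeline and three-stage curriculum training are what would make a dialogue LLM reliable at producing these three outputs per turn.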
Anthology ID:
2025.acl-long.218
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4333–4354
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.218/
Cite (ACL):
Jiale Hong, Hongqiu Wu, and Hai Zhao. 2025. Game Development as Human-LLM Interaction. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4333–4354, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Game Development as Human-LLM Interaction (Hong et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.218.pdf