PLM-based World Models for Text-based Games

Minsoo Kim, Yeonjoon Jung, Dohyeon Lee, Seung-won Hwang


Abstract
World models improve the sample efficiency of reinforcement learning agents by being trained to predict plausible changes in the underlying environment. As the core tasks of world models are future prediction and commonsense understanding, our claim is that pre-trained language models (PLMs) already provide a strong base upon which to build world models. Worldformer is a recently proposed world model for text-based game environments, based only partially on PLMs and Transformers. Our distinction is to fully leverage PLMs as actionable world models in text-based game environments, by reformulating generation as constrained decoding that decomposes actions into verb templates and objects. We show that our model improves future valid action prediction and graph change prediction. Additionally, we show that our model better reflects commonsense than a standard PLM.
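The sketch below is not the authors' released code; it is a minimal illustration, under assumed names (`plm_score`, `decode_actions`), of the general idea of constrained decoding that decomposes actions into verb templates and objects, with a placeholder standing in for the PLM scorer.

```python
# Illustrative sketch only, not the paper's implementation.
# Decompose actions into verb templates ("take OBJ") and objects from the
# current game state, then rank the filled-in actions with a PLM-style scorer.

from itertools import product
from typing import Callable, Iterable

def plm_score(candidate: str, context: str) -> float:
    """Placeholder for a PLM log-likelihood of `candidate` given `context`.
    A real system would score candidates with a pre-trained language model."""
    # Toy heuristic: favor candidates whose tokens appear in the observation.
    return float(sum(token in context for token in candidate.split()))

def decode_actions(
    context: str,
    templates: Iterable[str],   # verb templates with OBJ slots, e.g. "put OBJ in OBJ"
    objects: Iterable[str],     # objects observed in the current environment state
    score: Callable[[str, str], float] = plm_score,
    top_k: int = 5,
) -> list[str]:
    """Constrained decoding: fill each template's OBJ slots only with objects
    from the environment, then keep the top_k candidates by score."""
    objects = list(objects)
    candidates = []
    for template in templates:
        n_slots = template.count("OBJ")
        if n_slots == 0:
            candidates.append(template)
            continue
        for combo in product(objects, repeat=n_slots):
            action = template
            for obj in combo:
                action = action.replace("OBJ", obj, 1)
            candidates.append(action)
    return sorted(candidates, key=lambda a: score(a, context), reverse=True)[:top_k]

if __name__ == "__main__":
    obs = "You are in the kitchen. There is an apple on the table and a closed fridge."
    print(decode_actions(obs,
                         templates=["take OBJ", "open OBJ", "put OBJ in OBJ", "look"],
                         objects=["apple", "fridge", "table"]))
```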
Anthology ID:
2022.emnlp-main.86
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1324–1341
URL:
https://aclanthology.org/2022.emnlp-main.86
Cite (ACL):
Minsoo Kim, Yeonjoon Jung, Dohyeon Lee, and Seung-won Hwang. 2022. PLM-based World Models for Text-based Games. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 1324–1341, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
PLM-based World Models for Text-based Games (Kim et al., EMNLP 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.emnlp-main.86.pdf