Abstract
Language models (LMs) often generate incoherent outputs: they refer to events and entity states that are incompatible with the state of the world described in their inputs. We introduce SITUATIONSUPERVISION, a family of approaches for improving coherence in LMs by training them to construct and condition on explicit representations of entities and their states. SITUATIONSUPERVISION has two components: an *auxiliary situation modeling* task that trains models to predict entity state representations in context, and a *latent state inference* procedure that imputes these states from partially annotated training data. SITUATIONSUPERVISION can be applied via fine-tuning (by supervising LMs to encode state variables in their hidden representations) and prompting (by inducing LMs to interleave textual descriptions of entity states with output text). In both cases, it requires only a small number of state annotations to produce substantial coherence improvements (up to a 16% reduction in errors), showing that standard LMs can be efficiently adapted to explicitly model language and aspects of its meaning.
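The prompting variant described in the abstract, in which textual descriptions of entity states are interleaved with output text, can be pictured with a short sketch. The example below is purely illustrative and not taken from the paper: the story text, the `[state: ...]` annotation format, and the `build_prompt` helper are hypothetical stand-ins for whatever few-shot format and LM API a reader might use.

```python
# Illustrative sketch of prompting-based situation supervision:
# few-shot examples interleave explicit entity-state descriptions
# ("[state: ...]" lines, a hypothetical format) with story text,
# inducing the LM to describe entity states before continuing.

FEW_SHOT_PROMPT = """\
Story: Anna put the apple in the basket and walked to the kitchen.
[state: apple in basket; Anna in kitchen; basket in living room]
Continuation: She realized she had left the apple behind in the living room.

Story: Omar filled the kettle and switched it on.
[state: kettle full; kettle on; water heating]
Continuation: A few minutes later the water was boiling.

Story: {story}
[state:"""


def build_prompt(story: str) -> str:
    """Insert a new story into the few-shot template; the LM is then asked
    to first emit a state description and only afterwards a continuation."""
    return FEW_SHOT_PROMPT.format(story=story)


if __name__ == "__main__":
    # A call to an LM completion API would go here; the sketch only
    # prints the assembled prompt so it stays self-contained.
    print(build_prompt("Mia locked the front door and pocketed the key."))
```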
- Anthology ID:
- 2023.findings-acl.795
- Volume:
- Findings of the Association for Computational Linguistics: ACL 2023
- Month:
- July
- Year:
- 2023
- Address:
- Toronto, Canada
- Editors:
- Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 12556–12571
- URL:
- https://aclanthology.org/2023.findings-acl.795
- DOI:
- 10.18653/v1/2023.findings-acl.795
- Cite (ACL):
- Belinda Z. Li, Maxwell Nye, and Jacob Andreas. 2023. Language Modeling with Latent Situations. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12556–12571, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal):
- Language Modeling with Latent Situations (Li et al., Findings 2023)
- PDF:
- https://preview.aclanthology.org/landing_page/2023.findings-acl.795.pdf