Pre-Learning Environment Representations for Data-Efficient Neural Instruction Following

David Gaddy, Dan Klein


Abstract
We consider the problem of learning to map from natural language instructions to state transitions (actions) in a data-efficient manner. Our method takes inspiration from the idea that it should be easier to ground language to concepts that have already been formed through pre-linguistic observation. We augment a baseline instruction-following learner with an initial environment-learning phase that uses observations of language-free state transitions to induce a suitable latent representation of actions before processing the instruction-following training data. We show that mapping to pre-learned representations substantially improves performance over systems whose representations are learned from limited instructional data alone.
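To make the two-phase idea in the abstract concrete, below is a minimal, hypothetical sketch in PyTorch: phase one fits an autoencoder-style environment model that compresses language-free (state, next state) transitions into a latent action, and phase two freezes that model and trains an instruction encoder to predict latents in the same space. All module names, dimensions, and the toy vector-state setup are illustrative assumptions, not the paper's actual architecture; see the linked code repository for the authors' implementation.

# A minimal sketch of the two-phase idea, assuming PyTorch and toy
# vector-valued states; names and dimensions are illustrative only.
import torch
import torch.nn as nn

STATE_DIM, LATENT_DIM, VOCAB_SIZE, EMB_DIM = 16, 32, 100, 64

class TransitionAutoencoder(nn.Module):
    """Phase 1: induce latent action representations from
    language-free (state, next_state) observations."""
    def __init__(self):
        super().__init__()
        # Infer a latent action from an observed transition.
        self.encode = nn.Sequential(
            nn.Linear(2 * STATE_DIM, 128), nn.ReLU(),
            nn.Linear(128, LATENT_DIM))
        # Apply a latent action to a start state to predict the outcome.
        self.apply_action = nn.Sequential(
            nn.Linear(STATE_DIM + LATENT_DIM, 128), nn.ReLU(),
            nn.Linear(128, STATE_DIM))

    def forward(self, state, next_state):
        z = self.encode(torch.cat([state, next_state], dim=-1))
        return self.apply_action(torch.cat([state, z], dim=-1))

class InstructionEncoder(nn.Module):
    """Phase 2: map an instruction into the pre-learned latent space."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, EMB_DIM)
        self.lstm = nn.LSTM(EMB_DIM, LATENT_DIM, batch_first=True)

    def forward(self, tokens):
        _, (h, _) = self.lstm(self.embed(tokens))
        return h[-1]  # final hidden state as the predicted latent action

# Phase 1: train the environment model on transitions alone
# (random tensors stand in for real observations).
env_model = TransitionAutoencoder()
opt = torch.optim.Adam(env_model.parameters())
state, next_state = torch.randn(8, STATE_DIM), torch.randn(8, STATE_DIM)
loss = nn.functional.mse_loss(env_model(state, next_state), next_state)
opt.zero_grad()
loss.backward()
opt.step()

# Phase 2: freeze the environment model and train the instruction
# encoder so its predicted latent reproduces the demonstrated transition.
for p in env_model.parameters():
    p.requires_grad_(False)
instr_model = InstructionEncoder()
opt = torch.optim.Adam(instr_model.parameters())
tokens = torch.randint(VOCAB_SIZE, (8, 5))
z = instr_model(tokens)
pred_next = env_model.apply_action(torch.cat([state, z], dim=-1))
loss = nn.functional.mse_loss(pred_next, next_state)
opt.zero_grad()
loss.backward()
opt.step()

Training only the instruction encoder in phase two means the limited instructional data is spent on grounding language to already-formed action representations rather than on learning those representations from scratch, which is the data-efficiency argument the abstract makes.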
Anthology ID:
P19-1188
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1946–1956
URL:
https://aclanthology.org/P19-1188
DOI:
10.18653/v1/P19-1188
Cite (ACL):
David Gaddy and Dan Klein. 2019. Pre-Learning Environment Representations for Data-Efficient Neural Instruction Following. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 1946–1956, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Pre-Learning Environment Representations for Data-Efficient Neural Instruction Following (Gaddy & Klein, ACL 2019)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/P19-1188.pdf
Software:
P19-1188.Software.zip
Video:
https://preview.aclanthology.org/nschneid-patch-5/P19-1188.mp4
Code:
dgaddy/environment-learning