Learning with Latent Language

Jacob Andreas, Dan Klein, Sergey Levine

Abstract
The named concepts and compositional operators present in natural language provide a rich source of information about the abstractions humans use to navigate the world. Can this linguistic background knowledge improve the generality and efficiency of learned classifiers and control policies? This paper aims to show that using the space of natural language strings as a parameter space is an effective way to capture natural task structure. In a pretraining phase, we learn a language interpretation model that transforms inputs (e.g. images) into outputs (e.g. labels) given natural language descriptions. To learn a new concept (e.g. a classifier), we search directly in the space of descriptions to minimize the interpreter’s loss on training examples. Crucially, our models do not require language data to learn these concepts: language is used only in pretraining to impose structure on subsequent learning. Results on image classification, text editing, and reinforcement learning show that, in all settings, models with a linguistic parameterization outperform those without.
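The abstract describes a two-stage procedure: pretrain an interpreter that maps (input, description) pairs to outputs, then learn a new concept by searching the space of descriptions for the string that minimizes the interpreter's loss on a handful of examples. The snippet below is a minimal sketch of that search step, not the authors' implementation from jacobandreas/l3; the interpreter, the candidate list, and the helper names (few_shot_loss, learn_concept) are hypothetical stand-ins. In the paper, candidates would come from a learned proposal model and the interpreter from the pretraining phase.

from typing import Callable, List, Tuple

Example = Tuple[object, object]  # one few-shot training pair (input x, label y)

def few_shot_loss(interpret: Callable[[object, str], object],
                  description: str,
                  examples: List[Example]) -> float:
    """Fraction of few-shot examples the interpreter gets wrong when
    conditioned on one candidate natural-language description."""
    errors = sum(interpret(x, description) != y for x, y in examples)
    return errors / len(examples)

def learn_concept(interpret: Callable[[object, str], object],
                  candidates: List[str],
                  examples: List[Example]) -> str:
    """Learn a concept by searching a finite set of candidate descriptions
    (e.g. samples from a proposal model) for the lowest-loss string."""
    return min(candidates, key=lambda d: few_shot_loss(interpret, d, examples))

if __name__ == "__main__":
    # Toy stand-in for the pretrained neural interpreter.
    def interpret(x: int, description: str) -> bool:
        if description == "the number is even":
            return x % 2 == 0
        if description == "the number is greater than three":
            return x > 3
        return False

    train = [(2, True), (4, True), (5, False)]
    best = learn_concept(interpret,
                         ["the number is even",
                          "the number is greater than three"],
                         train)
    print(best)  # -> the number is even

Note that only the description string is optimized at concept-learning time; the interpreter's parameters stay fixed, which is what makes the language act as the parameter space.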
Anthology ID:
N18-1197
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2166–2179
URL:
https://aclanthology.org/N18-1197
DOI:
10.18653/v1/N18-1197
Cite (ACL):
Jacob Andreas, Dan Klein, and Sergey Levine. 2018. Learning with Latent Language. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), pages 2166–2179, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Learning with Latent Language (Andreas et al., NAACL 2018)
PDF:
https://aclanthology.org/N18-1197.pdf
Code:
jacobandreas/l3