Skill Induction and Planning with Latent Language

Pratyusha Sharma, Antonio Torralba, Jacob Andreas


Abstract
We present a framework for learning hierarchical policies from demonstrations, using sparse natural language annotations to guide the discovery of reusable skills for autonomous decision-making. We formulate a generative model of action sequences in which goals generate sequences of high-level subtask descriptions, and these descriptions generate sequences of low-level actions. We describe how to train this model using primarily unannotated demonstrations by parsing demonstrations into sequences of named high-level subtasks, using only a small number of seed annotations to ground language in action. In trained models, natural language commands index a combinatorial library of skills; agents can use these skills to plan by generating high-level instruction sequences tailored to novel goals. We evaluate this approach in the ALFRED household simulation environment, providing natural language annotations for only 10% of demonstrations. It achieves performance comparable to state-of-the-art models on ALFRED success rate, outperforming several recent methods with access to ground-truth plans during training and evaluation.
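The abstract describes a two-level generative structure: a goal generates a sequence of natural-language subtask descriptions, and each description generates a sequence of low-level actions. A minimal sketch of that factorization, with toy lookup tables standing in for the paper's learned high- and low-level policies (all names, subtasks, and action strings below are illustrative, not taken from the paper or the ALFRED API):

```python
from typing import List

def high_level_policy(goal: str) -> List[str]:
    """Map a goal to a sequence of language subtask descriptions.
    In the paper this is a learned sequence model; here, a toy lookup."""
    plans = {
        "put a chilled apple on the table": [
            "pick up the apple",
            "put the apple in the fridge",
            "take the apple out of the fridge",
            "put the apple on the table",
        ]
    }
    return plans.get(goal, [])

def low_level_policy(subtask: str) -> List[str]:
    """Map one language-indexed skill to primitive actions (toy version)."""
    skills = {
        "pick up the apple": ["MoveAhead", "PickupObject(apple)"],
        "put the apple in the fridge": ["MoveAhead", "OpenObject(fridge)",
                                        "PutObject(apple, fridge)",
                                        "CloseObject(fridge)"],
        "take the apple out of the fridge": ["OpenObject(fridge)",
                                             "PickupObject(apple)",
                                             "CloseObject(fridge)"],
        "put the apple on the table": ["MoveAhead", "PutObject(apple, table)"],
    }
    return skills.get(subtask, [])

def plan(goal: str) -> List[str]:
    """Hierarchical planning: language subtasks index reusable skills,
    and their low-level expansions are concatenated into one action plan."""
    actions: List[str] = []
    for subtask in high_level_policy(goal):
        actions.extend(low_level_policy(subtask))
    return actions
```

The point of the sketch is the interface, not the lookup tables: because skills are indexed by language, the high-level policy can compose them into new instruction sequences for goals never seen during training.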
Anthology ID:
2022.acl-long.120
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Note:
Pages:
1713–1726
URL:
https://aclanthology.org/2022.acl-long.120
DOI:
10.18653/v1/2022.acl-long.120
Cite (ACL):
Pratyusha Sharma, Antonio Torralba, and Jacob Andreas. 2022. Skill Induction and Planning with Latent Language. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1713–1726, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Skill Induction and Planning with Latent Language (Sharma et al., ACL 2022)
PDF:
https://preview.aclanthology.org/paclic-22-ingestion/2022.acl-long.120.pdf
Video:
https://preview.aclanthology.org/paclic-22-ingestion/2022.acl-long.120.mp4
Data
ALFRED