Abstract
Compositionality is a hallmark of human language that not only enables linguistic generalization, but also potentially facilitates acquisition. When simulating language emergence with neural networks, compositionality has been shown to improve communication performance; however, its impact on imitation learning has yet to be investigated. Our work explores the link between compositionality and imitation in a Lewis game played by deep neural agents. Our contributions are twofold: first, we show that the learning algorithm used to imitate is crucial: supervised learning tends to produce more average languages, while reinforcement learning introduces a selection pressure toward more compositional languages. Second, our study reveals that compositional languages are easier to imitate, which may induce the pressure toward compositional languages in RL imitation settings.

- Anthology ID: 2023.findings-acl.787
- Volume: Findings of the Association for Computational Linguistics: ACL 2023
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 12432–12447
- URL: https://aclanthology.org/2023.findings-acl.787
- DOI: 10.18653/v1/2023.findings-acl.787
- Cite (ACL): Emily Cheng, Mathieu Rita, and Thierry Poibeau. 2023. On the Correspondence between Compositionality and Imitation in Emergent Neural Communication. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12432–12447, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): On the Correspondence between Compositionality and Imitation in Emergent Neural Communication (Cheng et al., Findings 2023)
- PDF: https://preview.aclanthology.org/proper-vol2-ingestion/2023.findings-acl.787.pdf
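The abstract's second finding, that compositional languages are easier to imitate, can be illustrated with a toy, non-neural sketch. Everything below (object sizes, the factorized imitator, the holistic baseline) is our own illustrative choice, not the paper's setup: an imitator that learns a per-attribute symbol mapping can recover a compositional language from a fraction of the object space, while an unstructured, holistic language gives it nothing to factorize.

```python
import itertools
import random

random.seed(0)

# Toy world: objects have N_ATTR attributes, each with N_VAL values.
N_ATTR, N_VAL = 2, 6
objects = list(itertools.product(range(N_VAL), repeat=N_ATTR))

# Compositional teacher: symbol i of the message names the value of
# attribute i, so the message simply mirrors the object.
compositional = {o: o for o in objects}

# Holistic teacher: an arbitrary bijection from objects to messages,
# with no part-whole structure to exploit.
shuffled = objects[:]
random.shuffle(shuffled)
holistic = dict(zip(objects, shuffled))

def imitate(language, train_objects):
    """Factorized imitator: learn a per-position symbol for each
    attribute value from the training pairs, then recompose."""
    table = {}  # (position, value) -> symbol
    for obj in train_objects:
        for pos, val in enumerate(obj):
            table[(pos, val)] = language[obj][pos]
    return lambda obj: tuple(table.get((pos, val))
                             for pos, val in enumerate(obj))

train = random.sample(objects, 12)  # only a third of the 36 objects
accs = {}
for name, language in [("compositional", compositional),
                       ("holistic", holistic)]:
    guess = imitate(language, train)
    accs[name] = sum(guess(o) == language[o] for o in objects) / len(objects)

print(accs)  # the compositional language is imitated far more accurately
```

The gap arises because the compositional teacher's messages factor into reusable per-attribute parts that generalize to unseen objects, whereas the holistic teacher's arbitrary mapping can only be memorized item by item.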