Tiny Language Models Enriched with Multimodal Knowledge from Multiplex Networks
Clayton Fields, Osama Natouf, Andrew McMains, Catherine Henry, Casey Kennington
- Anthology ID:
- 2023.conll-babylm.3
- Volume:
- Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning
- Month:
- December
- Year:
- 2023
- Address:
- Singapore
- Editors:
- Alex Warstadt, Aaron Mueller, Leshem Choshen, Ethan Wilcox, Chengxu Zhuang, Juan Ciro, Rafael Mosquera, Bhargavi Paranjabe, Adina Williams, Tal Linzen, Ryan Cotterell
- Venue:
- CoNLL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 47–57
- URL:
- https://aclanthology.org/2023.conll-babylm.3
- DOI:
- 10.18653/v1/2023.conll-babylm.3
- Cite (ACL):
- Clayton Fields, Osama Natouf, Andrew McMains, Catherine Henry, and Casey Kennington. 2023. Tiny Language Models Enriched with Multimodal Knowledge from Multiplex Networks. In Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning, pages 47–57, Singapore. Association for Computational Linguistics.
- Cite (Informal):
- Tiny Language Models Enriched with Multimodal Knowledge from Multiplex Networks (Fields et al., CoNLL 2023)
- PDF:
- https://preview.aclanthology.org/emnlp22-frontmatter/2023.conll-babylm.3.pdf