Baby’s CoThought: Leveraging Large Language Models for Enhanced Reasoning in Compact Models

Zheyu Zhang, Han Yang, Bolei Ma, David Rügamer, Ercong Nie


Anthology ID:
2023.conll-babylm.13
Volume:
Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning
Month:
December
Year:
2023
Address:
Singapore
Editors:
Alex Warstadt, Aaron Mueller, Leshem Choshen, Ethan Wilcox, Chengxu Zhuang, Juan Ciro, Rafael Mosquera, Bhargavi Paranjabe, Adina Williams, Tal Linzen, Ryan Cotterell
Venues:
CoNLL | BabyLM | WS
Publisher:
Association for Computational Linguistics
Pages:
158–170
URL:
https://preview.aclanthology.org/ingest_wac_2008/2023.conll-babylm.13/
DOI:
10.18653/v1/2023.conll-babylm.13
Cite (ACL):
Zheyu Zhang, Han Yang, Bolei Ma, David Rügamer, and Ercong Nie. 2023. Baby’s CoThought: Leveraging Large Language Models for Enhanced Reasoning in Compact Models. In Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning, pages 158–170, Singapore. Association for Computational Linguistics.
Cite (Informal):
Baby’s CoThought: Leveraging Large Language Models for Enhanced Reasoning in Compact Models (Zhang et al., CoNLL-BabyLM 2023)
PDF:
https://preview.aclanthology.org/ingest_wac_2008/2023.conll-babylm.13.pdf