Towards more Human-like Language Models based on Contextualizer Pretraining Strategy

Chenghao Xiao, G Thomas Hudson, Noura Al Moubayed


Anthology ID:
2023.conll-babylm.28
Volume:
Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning
Month:
December
Year:
2023
Address:
Singapore
Editors:
Alex Warstadt, Aaron Mueller, Leshem Choshen, Ethan Wilcox, Chengxu Zhuang, Juan Ciro, Rafael Mosquera, Bhargavi Paranjape, Adina Williams, Tal Linzen, Ryan Cotterell
Venues:
CoNLL | BabyLM | WS
Publisher:
Association for Computational Linguistics
Note:
Pages:
317–326
URL:
https://aclanthology.org/2023.conll-babylm.28/
DOI:
10.18653/v1/2023.conll-babylm.28
Cite (ACL):
Chenghao Xiao, G Thomas Hudson, and Noura Al Moubayed. 2023. Towards more Human-like Language Models based on Contextualizer Pretraining Strategy. In Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning, pages 317–326, Singapore. Association for Computational Linguistics.
Cite (Informal):
Towards more Human-like Language Models based on Contextualizer Pretraining Strategy (Xiao et al., CoNLL-BabyLM 2023)
PDF:
https://aclanthology.org/2023.conll-babylm.28.pdf