- Anthology ID:
- 2023.conll-babylm.26
- Volume:
- Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning
- Month:
- December
- Year:
- 2023
- Address:
- Singapore
- Editors:
- Alex Warstadt, Aaron Mueller, Leshem Choshen, Ethan Wilcox, Chengxu Zhuang, Juan Ciro, Rafael Mosquera, Bhargavi Paranjape, Adina Williams, Tal Linzen, Ryan Cotterell
- Venue:
- CoNLL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 298–307
- URL:
- https://aclanthology.org/2023.conll-babylm.26
- DOI:
- 10.18653/v1/2023.conll-babylm.26
- Cite (ACL):
- Gábor Berend. 2023. Better Together: Jointly Using Masked Latent Semantic Modeling and Masked Language Modeling for Sample Efficient Pre-training. In Proceedings of the BabyLM Challenge at the 27th Conference on Computational Natural Language Learning, pages 298–307, Singapore. Association for Computational Linguistics.
- Cite (Informal):
- Better Together: Jointly Using Masked Latent Semantic Modeling and Masked Language Modeling for Sample Efficient Pre-training (Berend, CoNLL 2023)
- PDF:
- https://aclanthology.org/2023.conll-babylm.26.pdf