- Anthology ID: 2021.findings-acl.387
- Volume: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
- Month: August
- Year: 2021
- Address: Online
- Editors: Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 4408–4413
- URL: https://aclanthology.org/2021.findings-acl.387
- DOI: 10.18653/v1/2021.findings-acl.387
- Cite (ACL): Chuhan Wu, Fangzhao Wu, and Yongfeng Huang. 2021. One Teacher is Enough? Pre-trained Language Model Distillation from Multiple Teachers. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, pages 4408–4413, Online. Association for Computational Linguistics.
- Cite (Informal): One Teacher is Enough? Pre-trained Language Model Distillation from Multiple Teachers (Wu et al., Findings 2021)
- PDF: https://aclanthology.org/2021.findings-acl.387.pdf
- Data: MIND, SST, SST-2