BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model

Alex Wang, Kyunghyun Cho


Abstract
We show that BERT (Devlin et al., 2018) is a Markov random field language model. This formulation gives rise to a natural procedure for sampling sentences from BERT. We generate from BERT and find that it can produce high-quality, fluent generations. Compared with the generations of a traditional left-to-right language model, BERT generates sentences that are more diverse but of slightly worse quality.
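The "natural procedure" the abstract refers to is, in the paper, a Gibbs-sampling-style loop: initialize a sequence of [MASK] tokens, then repeatedly pick a position, re-mask it, and redraw that token from BERT's conditional distribution given the rest of the sentence. Below is a minimal sketch of that idea, assuming the HuggingFace transformers API; the model name, sequence length, iteration count, and temperature are illustrative choices, not the authors' exact configuration (their implementation is in the linked kyunghyuncho/bert-gen repository).

import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def gibbs_sample(seq_len=10, n_iters=500, temperature=1.0):
    # Start from "[CLS] [MASK] ... [MASK] [SEP]".
    ids = torch.tensor([[tokenizer.cls_token_id]
                        + [tokenizer.mask_token_id] * seq_len
                        + [tokenizer.sep_token_id]])
    for _ in range(n_iters):
        # Pick a non-special position uniformly at random and re-mask it.
        pos = torch.randint(1, seq_len + 1, (1,)).item()
        ids[0, pos] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(input_ids=ids).logits
        # Resample that token from BERT's conditional distribution,
        # given all the other tokens in the sequence.
        probs = torch.softmax(logits[0, pos] / temperature, dim=-1)
        ids[0, pos] = torch.multinomial(probs, num_samples=1).item()
    return tokenizer.decode(ids[0, 1:-1].tolist())

print(gibbs_sample())

Because each step resamples a single token conditioned on all of the others, the loop mirrors the paper's MRF view of BERT: the masked-LM conditionals play the role of the Gibbs conditionals of the underlying random field.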
Anthology ID:
W19-2304
Volume:
Proceedings of the Workshop on Methods for Optimizing and Evaluating Neural Language Generation
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Antoine Bosselut, Asli Celikyilmaz, Marjan Ghazvininejad, Srinivasan Iyer, Urvashi Khandelwal, Hannah Rashkin, Thomas Wolf
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
30–36
URL:
https://aclanthology.org/W19-2304
DOI:
10.18653/v1/W19-2304
Cite (ACL):
Alex Wang and Kyunghyun Cho. 2019. BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model. In Proceedings of the Workshop on Methods for Optimizing and Evaluating Neural Language Generation, pages 30–36, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model (Wang & Cho, NAACL 2019)
PDF:
https://aclanthology.org/W19-2304.pdf
Code
kyunghyuncho/bert-gen (+ additional community code)
Data
BookCorpus, WikiText-103