BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model

Alex Wang, Kyunghyun Cho

We show that BERT (Devlin et al., 2018) is a Markov random field language model. This formulation gives rise to a natural procedure for sampling sentences from BERT. We generate from BERT and find that it can produce high-quality, fluent generations. Compared to the generations of a traditional left-to-right language model, BERT generates sentences that are more diverse but of slightly worse quality.
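The sampling procedure the abstract alludes to is a Gibbs-style loop: repeatedly pick a position, mask it, and resample that token from the model's conditional distribution given all other tokens. The sketch below illustrates this loop with a toy hand-built conditional standing in for BERT's masked-token predictor; the vocabulary, `toy_conditional`, and `gibbs_sample_sentence` are illustrative assumptions, not the paper's implementation.

```python
import random

# Toy vocabulary. In the paper's setting, the conditional distribution
# below would come from BERT's masked-token prediction head instead.
VOCAB = ["the", "cat", "dog", "sat", "ran", "here"]

def toy_conditional(tokens, pos):
    # Hypothetical stand-in for p(x_pos | x_others): a distribution over
    # VOCAB that mildly favors simple bigram-like patterns.
    weights = {w: 1.0 for w in VOCAB}
    if pos > 0 and tokens[pos - 1] == "the":
        weights["cat"] += 5.0
        weights["dog"] += 5.0
    if pos > 0 and tokens[pos - 1] in ("cat", "dog"):
        weights["sat"] += 5.0
        weights["ran"] += 5.0
    total = sum(weights.values())
    return {w: v / total for w, v in weights.items()}

def gibbs_sample_sentence(length=4, sweeps=50, seed=0):
    # Start from a random sequence, then repeatedly resample one
    # position at a time from its conditional given the rest.
    rng = random.Random(seed)
    tokens = [rng.choice(VOCAB) for _ in range(length)]
    for _ in range(sweeps):
        pos = rng.randrange(length)          # position to resample
        dist = toy_conditional(tokens, pos)  # condition on other tokens
        words, probs = zip(*dist.items())
        tokens[pos] = rng.choices(words, weights=probs)[0]
    return tokens

if __name__ == "__main__":
    print(" ".join(gibbs_sample_sentence()))
```

With a real masked LM, `toy_conditional` would be replaced by a forward pass over the sequence with a `[MASK]` at `pos`, sampling from the softmax at that position.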
