PoeLM: A Meter- and Rhyme-Controllable Language Model for Unsupervised Poetry Generation

Aitor Ormazabal, Mikel Artetxe, Manex Agirrezabal, Aitor Soroa, Eneko Agirre


Abstract
Formal verse poetry imposes strict constraints on the meter and rhyme scheme of poems. Most prior work on generating this type of poetry uses existing poems for supervision, which are difficult to obtain for most languages and poetic forms. In this work, we propose an unsupervised approach to generate poems that follow any given meter and rhyme scheme, without requiring any poetic text for training. Our method works by splitting a regular, non-poetic corpus into phrases, prepending control codes that describe the length and end rhyme of each phrase, and training a transformer language model on the augmented corpus. The transformer learns to associate these control codes with the number of phrases, their length, and their end rhymes. During inference, we build control codes for the desired meter and rhyme scheme, and condition our language model on them to generate formal verse poetry. Experiments in Spanish and Basque show that our approach is able to generate valid poems, which are often comparable in quality to those written by humans.
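As a rough illustration of the augmentation step described above, the sketch below splits a corpus sentence into phrases and prepends control codes for each phrase's length and ending sound. The tag format (<LEN_n>, <END_x>, <LINE>), the crude vowel-group syllable counter, and the last-characters rhyme heuristic are assumptions for illustration only; the paper's actual control-code scheme, syllabification, and rhyme extraction are not given in this abstract.

import re

def count_syllables(phrase: str) -> int:
    # Crude vowel-group count standing in for a proper Spanish/Basque syllabifier.
    return max(1, len(re.findall(r"[aeiouáéíóúü]+", phrase.lower())))

def ending_sound(phrase: str) -> str:
    # Last few characters of the final word, standing in for a real rhyme extractor.
    last_word = phrase.rstrip(".,;!?").split()[-1].lower()
    return last_word[-3:]

def augment(sentence: str) -> str:
    # Split the sentence into phrases and prefix each with hypothetical control codes
    # describing its syllable count and end rhyme, mimicking the training-data augmentation.
    phrases = [p.strip() for p in re.split(r"[,;]", sentence) if p.strip()]
    coded = [f"<LEN_{count_syllables(p)}> <END_{ending_sound(p)}> {p}" for p in phrases]
    return " <LINE> ".join(coded)

print(augment("cantando me he de morir, cantando me han de enterrar"))

At inference time, one would analogously construct a prefix of control codes for the target form (for example, four phrases of the desired syllable count with an abba rhyme pattern) and let the trained language model generate phrases conditioned on it.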
Anthology ID:
2022.findings-emnlp.268
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3655–3670
URL:
https://aclanthology.org/2022.findings-emnlp.268
DOI:
10.18653/v1/2022.findings-emnlp.268
Cite (ACL):
Aitor Ormazabal, Mikel Artetxe, Manex Agirrezabal, Aitor Soroa, and Eneko Agirre. 2022. PoeLM: A Meter- and Rhyme-Controllable Language Model for Unsupervised Poetry Generation. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 3655–3670, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
PoeLM: A Meter- and Rhyme-Controllable Language Model for Unsupervised Poetry Generation (Ormazabal et al., Findings 2022)
PDF:
https://preview.aclanthology.org/remove-xml-comments/2022.findings-emnlp.268.pdf