Poetry in RAGs: Modern Greek interwar poetry generation using RAG and contrastive training

Stergios Chatzikyriakidis, Anastasia Natsina


Abstract
In this paper, we discuss Modern Greek poetry generation in the style of lesser-known Greek poets of the interwar period. The paper proposes the use of Retrieval-Augmented Generation (RAG) to automatically generate poetry with Large Language Models (LLMs). A corpus of Greek interwar poetry is used, and prompts exemplifying each poet's style with respect to a theme are created and then fed to an LLM. The results are compared to pure LLM generation, and expert evaluators score the poems across a number of parameters. Objective metrics such as Vocabulary Density, Average Words per Sentence, and Readability Index are also used to assess model performance. RAG-assisted models show potential in enhancing poetry generation across several parameters. Base LLM models appear quite consistent across categories, while the contrastively trained RAG model shows the worst performance of the three.
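The paper does not publish its implementation; the sketch below is only a minimal illustration of the pipeline the abstract describes: retrieve stanzas matching a theme, assemble a style-exemplifying prompt, and pass it to an LLM. The embedding model, the similarity measure, the prompt wording, and the metric formulation are all assumptions, not the authors' choices.

# A minimal sketch, not the authors' code. Assumptions: a multilingual
# sentence-embedding model for retrieval; the LLM call itself is omitted.
import numpy as np
from sentence_transformers import SentenceTransformer

# Hypothetical corpus: stanzas by one interwar poet, one string per entry.
corpus = [
    "stanza 1 ...",
    "stanza 2 ...",
    "stanza 3 ...",
]

embedder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")
corpus_emb = embedder.encode(corpus, normalize_embeddings=True)

def retrieve(theme, k=3):
    """Return the k stanzas most similar to the theme (cosine similarity;
    vectors are L2-normalized, so a dot product suffices)."""
    q = embedder.encode([theme], normalize_embeddings=True)[0]
    top = np.argsort(corpus_emb @ q)[::-1][:k]
    return [corpus[i] for i in top]

def build_prompt(poet, theme):
    """Assemble a prompt exemplifying the poet's style on a theme,
    as the abstract describes (wording assumed)."""
    exemplars = "\n\n".join(retrieve(theme))
    return (
        f"The following stanzas exemplify the style of {poet}:\n\n"
        f"{exemplars}\n\n"
        f"Write a new poem on the theme of '{theme}' in this style."
    )

def vocabulary_density(text):
    """Type-token ratio, one plausible reading of the paper's
    'Vocabulary Density' metric (exact formulation assumed)."""
    words = text.lower().split()
    return len(set(words)) / max(len(words), 1)

# The prompt is then sent to an LLM; the pure-LLM baseline corresponds to
# the same request without the retrieved exemplars. That call is omitted.
print(build_prompt("<poet>", "<theme>"))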
Anthology ID:
2025.nlp4dh-1.22
Volume:
Proceedings of the 5th International Conference on Natural Language Processing for Digital Humanities
Month:
May
Year:
2025
Address:
Albuquerque, USA
Editors:
Mika Hämäläinen, Emily Öhman, Yuri Bizzoni, So Miyagawa, Khalid Alnajjar
Venues:
NLP4DH | WS
Publisher:
Association for Computational Linguistics
Pages:
257–264
URL:
https://preview.aclanthology.org/Ingest-2025-COMPUTEL/2025.nlp4dh-1.22/
Cite (ACL):
Stergios Chatzikyriakidis and Anastasia Natsina. 2025. Poetry in RAGs: Modern Greek interwar poetry generation using RAG and contrastive training. In Proceedings of the 5th International Conference on Natural Language Processing for Digital Humanities, pages 257–264, Albuquerque, USA. Association for Computational Linguistics.
Cite (Informal):
Poetry in RAGs: Modern Greek interwar poetry generation using RAG and contrastive training (Chatzikyriakidis & Natsina, NLP4DH 2025)
PDF:
https://preview.aclanthology.org/Ingest-2025-COMPUTEL/2025.nlp4dh-1.22.pdf