Augmented Close Reading for Classical Latin using BERT for Intertextual Exploration

Ashley Gong, Katy Gero, Mark Schiefsky


Abstract
Intertextuality, the connection between texts, is a critical literary concept for analyzing classical Latin works. Given the emergence of AI in digital humanities, this paper presents Intertext.AI, a novel interface that leverages Latin BERT (Bamman and Burns 2020), a BERT model trained on classical Latin texts, and contextually rich visualizations to help classicists find potential intertextual connections. Intertext.AI identified over 80% of attested allusions in excerpts of Lucan's Pharsalia, demonstrating the system's technical efficacy. Findings from a user study with 19 participants further suggest that Intertext.AI supports intertextual discovery and interpretation more readily than existing tools. While participants did not identify significantly different types or quantities of connections with Intertext.AI than with other tools, they found it easier to locate and justify potential intertextuality with Intertext.AI, reported higher confidence in the observations they made with it, and preferred having access to it during the search process.
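The abstract does not detail how Latin BERT is used inside Intertext.AI. As a rough illustration only, the sketch below shows one common way contextual embeddings can be used to score a candidate intertext: embed a target line and a candidate source line, then compare them by cosine similarity. The checkpoint path, the Hugging Face loading API, the mean-pooling step, and the example lines are assumptions for this sketch, not details taken from the paper.

```python
# Hypothetical sketch: scoring a candidate intertext with Latin BERT embeddings.
# The checkpoint path and pooling choice are assumptions, not Intertext.AI's actual pipeline.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_PATH = "path/to/latin-bert"  # placeholder: a locally available Latin BERT checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModel.from_pretrained(MODEL_PATH)
model.eval()

def embed(lines):
    """Mean-pool the final hidden states of each Latin line into one vector per line."""
    batch = tokenizer(lines, return_tensors="pt", padding=True, truncation=True)
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state      # (batch, tokens, dim)
    mask = batch["attention_mask"].unsqueeze(-1)       # ignore padding tokens when averaging
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# A target line from Lucan's Pharsalia and a candidate source line from Vergil's Aeneid.
target = ["bella per Emathios plus quam civilia campos"]
candidate = ["arma virumque cano, Troiae qui primus ab oris"]

score = torch.nn.functional.cosine_similarity(embed(target), embed(candidate))
print(f"cosine similarity: {score.item():.3f}")  # higher scores suggest closer contexts
```

In such a setup, candidate passages from a source corpus would be ranked by this similarity score and surfaced to the reader for close inspection; whether Intertext.AI ranks candidates this way is not stated in the abstract.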
Anthology ID: 2025.nlp4dh-1.35
Volume: Proceedings of the 5th International Conference on Natural Language Processing for Digital Humanities
Month: May
Year: 2025
Address: Albuquerque, USA
Editors: Mika Hämäläinen, Emily Öhman, Yuri Bizzoni, So Miyagawa, Khalid Alnajjar
Venues: NLP4DH | WS
Publisher: Association for Computational Linguistics
Pages: 403–417
URL: https://preview.aclanthology.org/corrections-2025-06/2025.nlp4dh-1.35/
DOI: 10.18653/v1/2025.nlp4dh-1.35
Cite (ACL): Ashley Gong, Katy Gero, and Mark Schiefsky. 2025. Augmented Close Reading for Classical Latin using BERT for Intertextual Exploration. In Proceedings of the 5th International Conference on Natural Language Processing for Digital Humanities, pages 403–417, Albuquerque, USA. Association for Computational Linguistics.
Cite (Informal): Augmented Close Reading for Classical Latin using BERT for Intertextual Exploration (Gong et al., NLP4DH 2025)
PDF: https://preview.aclanthology.org/corrections-2025-06/2025.nlp4dh-1.35.pdf