enunlg: a Python library for reproducible neural data-to-text experimentation

David M. Howcroft, Dimitra Gkatzia


Abstract
Over the past decade, a variety of neural architectures for data-to-text natural language generation (NLG) have been proposed. However, each system typically has its own approach to pre- and post-processing and other implementation details. Diversity in implementations is desirable, but it also confounds attempts to compare model performance: are the differences due to the proposed architectures themselves, or are they a byproduct of the libraries used or of pre- and post-processing decisions? To improve reproducibility, we re-implement several pre-Transformer neural models for data-to-text NLG within a single framework to facilitate direct comparisons of the models themselves and to better understand the contributions of other design choices. We release our library at https://github.com/NapierNLP/enunlg to serve as a baseline for ongoing work in this area, including research on NLG for low-resource languages, where Transformers might not be optimal.
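
The sketch below is a minimal, hypothetical illustration of the confound the abstract describes; it is not the enunlg API, and all function names (tokenize, token_f1) are assumptions made only for this example. It shows how the very same model output can receive noticeably different scores under two preprocessing pipelines (here, with and without lowercasing), using a simple unigram-overlap F1 as a stand-in for an automatic metric.

    import re

    def tokenize(text: str, lowercase: bool) -> list[str]:
        # Split on word characters and punctuation; optionally lowercase first.
        if lowercase:
            text = text.lower()
        return re.findall(r"\w+|[^\w\s]", text)

    def token_f1(ref: list[str], hyp: list[str]) -> float:
        # Unigram-overlap F1: a toy stand-in for a surface-overlap metric.
        overlap = len(set(ref) & set(hyp))
        if overlap == 0:
            return 0.0
        p, r = overlap / len(set(hyp)), overlap / len(set(ref))
        return 2 * p * r / (p + r)

    reference = "The Eagle serves French food."
    hypothesis = "the eagle serves french food"

    # Same output, two preprocessing pipelines, two different scores.
    raw = token_f1(tokenize(reference, lowercase=False),
                   tokenize(hypothesis, lowercase=False))
    norm = token_f1(tokenize(reference, lowercase=True),
                    tokenize(hypothesis, lowercase=True))
    print(f"raw pipeline F1:        {raw:.2f}")   # ~0.36
    print(f"normalised pipeline F1: {norm:.2f}")  # ~0.91

Holding such pipeline choices fixed across re-implemented models, as a shared framework makes possible, is what allows the architectures themselves to be compared directly.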
Anthology ID:
2023.inlg-demos.2
Volume:
Proceedings of the 16th International Natural Language Generation Conference: System Demonstrations
Month:
September
Year:
2023
Address:
Prague, Czechia
Editors:
C. Maria Keet, Hung-Yi Lee, Sina Zarrieß
Venues:
INLG | SIGDIAL
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
4–5
URL:
https://aclanthology.org/2023.inlg-demos.2
Cite (ACL):
David M. Howcroft and Dimitra Gkatzia. 2023. enunlg: a Python library for reproducible neural data-to-text experimentation. In Proceedings of the 16th International Natural Language Generation Conference: System Demonstrations, pages 4–5, Prague, Czechia. Association for Computational Linguistics.
Cite (Informal):
enunlg: a Python library for reproducible neural data-to-text experimentation (Howcroft & Gkatzia, INLG-SIGDIAL 2023)
PDF:
https://aclanthology.org/2023.inlg-demos.2.pdf