Optimizing the Training of Models for Automated Post-Correction of Arbitrary OCR-ed Historical Texts

Tobias Englmeier, Florian Fink, Uwe Springmann, Klaus U. Schulz


Anthology ID: 2022.jlcl-1.1
Volume: Journal for Language Technology and Computational Linguistics, Vol. 35 No. 1
Month: Dec.
Year: 2022
Address: Germany
Editor: Christian Wartena
Venue: JLCL
Publisher: German Society for Computational Linguistics and Language Technology
Pages: 1–27
URL: https://preview.aclanthology.org/moar-dois/2022.jlcl-1.1/
DOI: 10.21248/jlcl.35.2022.232
Cite (ACL): Tobias Englmeier, Florian Fink, Uwe Springmann, and Klaus U. Schulz. 2022. Optimizing the Training of Models for Automated Post-Correction of Arbitrary OCR-ed Historical Texts. Journal for Language Technology and Computational Linguistics, 35(1):1–27.
Cite (Informal): Optimizing the Training of Models for Automated Post-Correction of Arbitrary OCR-ed Historical Texts (Englmeier et al., JLCL 2022)
PDF: https://preview.aclanthology.org/moar-dois/2022.jlcl-1.1.pdf