Kervy Dante Rivas Rojas
2022
Revisiting Syllables in Language Modelling and Their Application on Low-Resource Machine Translation
Arturo Oncevay | Kervy Dante Rivas Rojas | Liz Karen Chavez Sanchez | Roberto Zariquiey
Proceedings of the 29th International Conference on Computational Linguistics
Language modelling and machine translation tasks mostly use subword or character inputs, but syllables are seldom used. Syllables provide shorter sequences than characters, require less-specialised extraction rules than morphemes, and their segmentation is not affected by corpus size. In this study, we first explore the potential of syllables for open-vocabulary language modelling in 21 languages. We use rule-based syllabification methods for six languages and address the rest with hyphenation, which serves as a syllabification proxy. Under comparable perplexity, we show that syllables outperform characters and other subwords. Moreover, we study the importance of syllables for neural machine translation of an unrelated, low-resource language pair (Spanish–Shipibo-Konibo). In pairwise and multilingual systems, syllables outperform unsupervised subwords and other morphological segmentation methods when translating into a highly synthetic language with a transparent orthography (Shipibo-Konibo). Finally, we perform a human evaluation and discuss limitations and opportunities.
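As a rough illustration of the hyphenation-as-syllabification idea described in this abstract, the sketch below segments words with the pyphen hyphenation library. The library choice, the language code, and the word-boundary marker are assumptions for illustration only; the paper's own rule-based syllabifiers and preprocessing may differ.

```python
# Minimal sketch: hyphenation as a proxy for syllable segmentation.
# Assumes the `pyphen` hyphenation library; the paper's rule-based
# syllabifiers and exact language set may differ.
import pyphen


def syllabify(word: str, lang: str = "es") -> list[str]:
    """Split a word into hyphenation-based 'syllables'."""
    dic = pyphen.Pyphen(lang=lang)
    return dic.inserted(word).split("-")


def segment_sentence(sentence: str, lang: str = "es") -> list[str]:
    """Turn a sentence into a syllable-level token sequence,
    marking word boundaries with a hypothetical <w> token."""
    tokens: list[str] = []
    for word in sentence.split():
        tokens.extend(syllabify(word, lang))
        tokens.append("<w>")  # hypothetical word-boundary marker
    return tokens


if __name__ == "__main__":
    print(segment_sentence("la traducción automática", lang="es"))
    # e.g. ['la', '<w>', 'tra', 'duc', 'ción', '<w>', 'au', 'to', 'má', 'ti', 'ca', '<w>']
```

Such a sequence could then feed an open-vocabulary language model or translation system in place of character or BPE tokens, which is the comparison the paper reports.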
2019
A Continuous Improvement Framework of Machine Translation for Shipibo-Konibo
Héctor Erasmo Gómez Montoya | Kervy Dante Rivas Rojas | Arturo Oncevay
Proceedings of the 2nd Workshop on Technologies for MT of Low Resource Languages