Probing Subphonemes in Morphology Models

Gal Astrach, Yuval Pinter

Abstract
Transformers have achieved state-of-the-art performance in morphological inflection tasks, yet their ability to generalize across languages and morphological rules remains limited. One possible explanation for this behavior is the degree to which these models capture implicit phenomena at the phonological and subphonemic levels. We introduce a language-agnostic probing method to investigate phonological feature encoding in transformers trained directly on phonemes, and apply it across seven morphologically diverse languages. We show that local phonological features, such as final-obstruent devoicing in Turkish, are captured well in phoneme embeddings, whereas long-distance dependencies such as vowel harmony are better represented in the transformer's encoder. Finally, we discuss how these findings inform empirical strategies for training morphological models, particularly regarding the role of subphonemic feature acquisition.
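
The abstract does not spell out the probing setup, so the sketch below is only a rough, hedged illustration of the general technique it names: train a linear probe on per-phoneme vectors and test whether a binary phonological feature (here, voicing) is linearly decodable. All variable names, array shapes, and labels are hypothetical stand-ins, not the paper's actual data or released code.

# Minimal linear-probe sketch (illustrative only; not the authors' code).
# Assumption: `embeddings` holds per-phoneme vectors extracted from a
# trained inflection model, and `voicing` holds a binary [+/- voice]
# label for each phoneme.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 40 phonemes with 128-dimensional vectors.
# In a real probe these would come from the transformer's phoneme
# embedding table (for local features) or its encoder states (for
# long-distance ones such as vowel harmony).
num_phonemes, dim = 40, 128
embeddings = rng.normal(size=(num_phonemes, dim))
voicing = rng.integers(0, 2, size=num_phonemes)

# If the feature is (quasi-)linearly encoded in the representation,
# cross-validated probe accuracy rises well above the 50% chance level.
probe = LogisticRegression(max_iter=1000)
scores = cross_val_score(probe, embeddings, voicing, cv=5)
print(f"probe accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")

Running the same probe on the static embedding table and on contextual encoder states, and comparing accuracies, is the kind of comparison that distinguishes where local versus long-distance features are represented.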
Anthology ID:
2025.findings-acl.672
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12954–12961
URL:
https://preview.aclanthology.org/landing_page/2025.findings-acl.672/
Cite (ACL):
Gal Astrach and Yuval Pinter. 2025. Probing Subphonemes in Morphology Models. In Findings of the Association for Computational Linguistics: ACL 2025, pages 12954–12961, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Probing Subphonemes in Morphology Models (Astrach & Pinter, Findings 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.findings-acl.672.pdf