Extracting Linguistic Information from Large Language Models: Syntactic Relations and Derivational Knowledge

Tsedeniya Kinfe Temesgen, Marion Di Marco, Alexander Fraser


Abstract
This paper presents a study of the linguistic knowledge and generalization capabilities of Large Language Models (LLMs), focusing on their morphosyntactic competence. We design three diagnostic tasks: (i) labeling syntactic information at the sentence level - identifying subjects, objects, and indirect objects; (ii) derivational decomposition at the word level - identifying morpheme boundaries and labeling the decomposed sequence; and (iii) an in-depth study of morphological decomposition in German and Amharic. We evaluate prompting strategies in GPT-4o and LLaMA 3.3-70B to extract different types of linguistic structure for typologically diverse languages. Our results show that GPT-4o consistently outperforms LLaMA in all tasks; however, both models exhibit limitations and show little evidence of abstract morphological rule learning. Importantly, we show strong evidence that the models fail to learn underlying morphological structures, raising important doubts about their ability to generalize.
Anthology ID:
2025.emnlp-main.1384
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
27198–27214
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.emnlp-main.1384/
DOI:
10.18653/v1/2025.emnlp-main.1384
Cite (ACL):
Tsedeniya Kinfe Temesgen, Marion Di Marco, and Alexander Fraser. 2025. Extracting Linguistic Information from Large Language Models: Syntactic Relations and Derivational Knowledge. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 27198–27214, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Extracting Linguistic Information from Large Language Models: Syntactic Relations and Derivational Knowledge (Temesgen et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.emnlp-main.1384.pdf
Checklist:
 2025.emnlp-main.1384.checklist.pdf