The unreasonable effectiveness of large language models for low-resource clause-level morphology: In-context generalization or prior exposure?

Coleman Haley


Abstract
This paper describes the submission of Team “Giving it a Shot” to the AmericasNLP 2024 Shared Task on the Creation of Educational Materials for Indigenous Languages. We use a simple few-shot prompting approach with several state-of-the-art large language models, achieving competitive performance on the shared task, with our best system placing third overall. We perform a preliminary analysis of the degree to which our models' performance is due to prior exposure to the task languages, finding that it is generally better explained by in-context learning capabilities.
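
Illustrative sketch: the abstract describes a simple few-shot prompting approach to clause-level transformations. As a hedged illustration of what such a setup can look like, the Python sketch below assembles a prompt from demonstration triples of (source sentence, requested grammatical change, target sentence). The template wording, change-tag names, and English example data are hypothetical stand-ins, not the prompt format or data actually used in the submission.

# Minimal sketch of few-shot prompt construction for clause-level
# morphology. Template, tags, and examples are hypothetical
# illustrations, not taken from the paper.

def build_prompt(examples, source, change):
    """Assemble a few-shot prompt from (source, change, target) triples."""
    lines = ["Rewrite the sentence so that it expresses the requested grammatical change."]
    for src, chg, tgt in examples:
        lines.append(f"Sentence: {src}")
        lines.append(f"Change: {chg}")
        lines.append(f"Rewritten: {tgt}")
    # The query instance ends with the "Rewritten:" cue, which the
    # model is expected to complete.
    lines.append(f"Sentence: {source}")
    lines.append(f"Change: {change}")
    lines.append("Rewritten:")
    return "\n".join(lines)

# Illustrative English stand-ins for the shared-task demonstration data.
demos = [
    ("She walks to school.", "TENSE:PAST", "She walked to school."),
    ("He is not singing.", "POLARITY:AFFIRMATIVE", "He is singing."),
]
print(build_prompt(demos, "They read the book.", "TENSE:FUTURE"))

A completion-style model would then be prompted with this string and its continuation after the final "Rewritten:" cue taken as the predicted target sentence.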
Anthology ID:
2024.americasnlp-1.20
Volume:
Proceedings of the 4th Workshop on Natural Language Processing for Indigenous Languages of the Americas (AmericasNLP 2024)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Manuel Mager, Abteen Ebrahimi, Shruti Rijhwani, Arturo Oncevay, Luis Chiruzzo, Robert Pugh, Katharina von der Wense
Venues:
AmericasNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
174–178
URL:
https://aclanthology.org/2024.americasnlp-1.20
Cite (ACL):
Coleman Haley. 2024. The unreasonable effectiveness of large language models for low-resource clause-level morphology: In-context generalization or prior exposure? In Proceedings of the 4th Workshop on Natural Language Processing for Indigenous Languages of the Americas (AmericasNLP 2024), pages 174–178, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
The unreasonable effectiveness of large language models for low-resource clause-level morphology: In-context generalization or prior exposure? (Haley, AmericasNLP-WS 2024)
PDF:
https://preview.aclanthology.org/jeptaln-2024-ingestion/2024.americasnlp-1.20.pdf
Supplementary material:
 2024.americasnlp-1.20.SupplementaryMaterial.zip