Learning Language through Grounding

Freda Shi, Ziqiao Ma, Jiayuan Mao, Parisa Kordjamshidi, Joyce Chai


Abstract
Grounding has been a long-standing concept in natural language processing (NLP) and computational linguistics (CL). This tutorial provides a historical overview and introduces recent advances in learning language through grounding, with a particular emphasis on the latter. We will begin by tracing the history of grounding and presenting a unified perspective on the term. In Parts II to IV, we will delve into recent progress in learning lexical semantics, syntax, and complex meanings through various forms of grounding. We will conclude by discussing future directions and open challenges, particularly those related to the growing trend of large language models and scaling.
Anthology ID: 2025.naacl-tutorial.6
Volume: Proceedings of the 2025 Annual Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 5: Tutorial Abstracts)
Month: May
Year: 2025
Address: Albuquerque, New Mexico
Editors: Maria Lomeli, Swabha Swayamdipta, Rui Zhang
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 38–43
URL: https://preview.aclanthology.org/fix-sig-urls/2025.naacl-tutorial.6/
Cite (ACL): Freda Shi, Ziqiao Ma, Jiayuan Mao, Parisa Kordjamshidi, and Joyce Chai. 2025. Learning Language through Grounding. In Proceedings of the 2025 Annual Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 5: Tutorial Abstracts), pages 38–43, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal): Learning Language through Grounding (Shi et al., NAACL 2025)
PDF: https://preview.aclanthology.org/fix-sig-urls/2025.naacl-tutorial.6.pdf