Justyna Grudzińska


2022

Prepositions Matter in Quantifier Scope Disambiguation
Aleksander Leczkowski | Justyna Grudzińska | Manuel Vargas Guzmán | Aleksander Wawer | Aleksandra Siemieniuk
Proceedings of the 29th International Conference on Computational Linguistics

Although it is widely agreed that world knowledge plays a significant role in quantifier scope disambiguation (QSD), there has been only very limited work on how to integrate this knowledge into a QSD model. This paper contributes to this scarce line of research by incorporating into a machine learning model our knowledge about relations, as conveyed by a manageable closed class of function words: prepositions. For data, we use a scope-disambiguated corpus created by AnderBois, Brasoveanu, and Henderson, which is additionally annotated with prepositional senses using Schneider et al.'s Semantic Network of Adposition and Case Supersenses (SNACS) scheme. By applying Manshadi and Allen's method to the corpus, we were able to inspect the information gain provided by prepositions for the QSD task. Statistical analysis of the performance of the classifiers, trained in scenarios with and without preposition information, supports the claim that prepositional senses have a strong positive impact on the learnability of automatic QSD systems.
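The abstract describes comparing classifiers trained with and without preposition-sense features. The sketch below is purely illustrative and is not the authors' code or data: it uses synthetic features, a hypothetical 8-class supersense encoding, and a logistic regression model chosen only for demonstration, to show what a with/without-prepositions comparison of this kind can look like.

```python
# Illustrative sketch only (not the paper's implementation): compare a toy QSD
# classifier trained with vs. without preposition-sense features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500

# Hypothetical features: a few syntactic cues plus a SNACS-style preposition
# supersense encoded as an integer category (8 classes assumed here).
syntactic = rng.normal(size=(n, 4))
prep_sense = rng.integers(0, 8, size=n)

# Toy scope label that partly depends on the preposition sense.
y = (syntactic[:, 0] + 0.8 * (prep_sense % 2)
     + rng.normal(scale=0.5, size=n) > 0.5).astype(int)

one_hot = np.eye(8)[prep_sense]          # one-hot encode the supersense
X_without = syntactic                    # scenario 1: no preposition information
X_with = np.hstack([syntactic, one_hot]) # scenario 2: with preposition senses

clf = LogisticRegression(max_iter=1000)
acc_without = cross_val_score(clf, X_without, y, cv=5).mean()
acc_with = cross_val_score(clf, X_with, y, cv=5).mean()
print(f"accuracy without preposition senses: {acc_without:.3f}")
print(f"accuracy with preposition senses:    {acc_with:.3f}")
```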

2014

System with Generalized Quantifiers on Dependent Types for Anaphora
Justyna Grudzińska | Marek Zawadowski
Proceedings of the EACL 2014 Workshop on Type Theory and Natural Language Semantics (TTNLS)