Abstract
Generalized quantifiers (e.g., few, most) are used to indicate the proportions predicates satisfy (for example, some apples are red). One way to interpret quantifier semantics is to explicitly bind these satisfactions with percentage scopes (e.g., 30%-40% of apples are red). This approach can be helpful for tasks like logic formalization and surface-form quantitative reasoning (Gordon and Schubert, 2010; Roy et al., 2015). However, it remains unclear if recent foundation models (Bommasani et al., 2021) possess this ability due to the absence of direct training signals. To explore this, we introduce QuRe, a crowd-sourced dataset of human-annotated generalized quantifiers in Wikipedia sentences featuring percentage-equipped predicates. We explore quantifier comprehension using PRESQUE, a framework that combines natural language inference and the Rational Speech Acts framework. Experimental results on the HVD dataset (Herbelot and Vecchi, 2015) and QuRe demonstrate PRESQUE’s superiority over a literal listener baseline, showing a 20% relative improvement in F1 in predicting percentage scopes for quantifiers, even with no additional training.

- Anthology ID: 2023.emnlp-main.38
- Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 573–591
- URL: https://aclanthology.org/2023.emnlp-main.38
- DOI: 10.18653/v1/2023.emnlp-main.38
- Cite (ACL): Yiyuan Li, Rakesh Menon, Sayan Ghosh, and Shashank Srivastava. 2023. Pragmatic Reasoning Unlocks Quantifier Semantics for Foundation Models. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 573–591, Singapore. Association for Computational Linguistics.
- Cite (Informal): Pragmatic Reasoning Unlocks Quantifier Semantics for Foundation Models (Li et al., EMNLP 2023)
- PDF: https://preview.aclanthology.org/fix-volume-bibkeys/2023.emnlp-main.38.pdf