Dense Retrieval with Quantity Comparison Intent

Prayas Agrawal, Nandeesh Kumar K M, Muthusamy Chelliah, Surender Kumar, Soumen Chakrabarti

Abstract
Pre-trained language models (PLMs) fragment the numerals and units that express quantities in arbitrary ways, depending on their subword vocabulary. Consequently, they cannot contextualize the fragment embeddings well enough to support dense retrieval in domains like e-commerce and finance. Arithmetic inequality constraints (“laptop under 2 lb”) pose additional challenges. In response, we propose DeepQuant, a dense retrieval system built around a dense multi-vector index, but carefully engineered to elicit and exploit quantities and their associated comparison intents. A novel component of our relevance score compares two quantities with compatible units, conditioned on a proposed comparison operator. The uncertain extractions of numerals, units, and comparators are marginalized in a suitable manner. On two public and one proprietary e-commerce benchmark, DeepQuant is both faster and more accurate than popular PLMs. It also beats several competitive sparse and dense retrieval systems that take no special cognizance of quantities.
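The abstract outlines the core scoring idea: compare two quantities with compatible units, conditioned on an extracted comparison operator, while marginalizing over uncertain extractions. The following is a minimal, hypothetical Python sketch of that idea, not the paper's implementation (which combines such a signal with a dense multi-vector index). The unit table, the sigmoid relaxation of the operators, and all names (`to_canonical`, `op_score`, `quantity_score`) are illustrative assumptions.

```python
import math

# Hypothetical unit table (an assumption, not from the paper):
# unit -> (canonical unit, multiplier to convert into it).
UNIT_TABLE = {
    "lb": ("kg", 0.45359237),
    "kg": ("kg", 1.0),
    "g":  ("kg", 0.001),
}

def to_canonical(value, unit):
    """Convert a (value, unit) pair into its canonical unit."""
    canon, mult = UNIT_TABLE[unit]
    return value * mult, canon

def op_score(op, q, d, tau=0.1):
    """Soft satisfaction of comparison operator `op` between the query
    bound q and the document quantity d (both canonical). Sigmoids turn
    hard inequalities into smooth scores in [0, 1]."""
    sig = lambda x: 1.0 / (1.0 + math.exp(-x / tau))
    if op == "<":
        return sig(q - d)  # document value should fall below the bound
    if op == ">":
        return sig(d - q)
    if op == "=":
        return math.exp(-abs(d - q) / tau)
    raise ValueError(f"unknown operator: {op}")

def quantity_score(query_cands, doc_cands):
    """Marginalize over uncertain extractions. Each query candidate is
    (value, unit, operator, prob); each document candidate is
    (value, unit, prob). Unit-incompatible pairs contribute nothing."""
    total = 0.0
    for qv, qu, op, qp in query_cands:
        for dv, du, dp in doc_cands:
            if UNIT_TABLE[qu][0] != UNIT_TABLE[du][0]:
                continue  # incompatible units, e.g. kg vs. inch
            (qc, _), (dc, _) = to_canonical(qv, qu), to_canonical(dv, du)
            total += qp * dp * op_score(op, qc, dc)
    return total

# "laptop under 2 lb": an extractor proposes (2, lb, <) with probability
# 0.9; the product page states a weight of 0.7 kg with probability 0.95.
query = [(2.0, "lb", "<", 0.9)]
doc = [(0.7, "kg", 0.95)]
print(f"{quantity_score(query, doc):.3f}")  # ~0.76: constraint likely satisfied
```

The sigmoid relaxation makes the inequality score differentiable, so a component like this could in principle be trained jointly with the dense retriever; the actual DeepQuant formulation may differ.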
Anthology ID:
2025.findings-acl.1220
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
23825–23839
URL:
https://preview.aclanthology.org/display_plenaries/2025.findings-acl.1220/
Cite (ACL):
Prayas Agrawal, Nandeesh Kumar K M, Muthusamy Chelliah, Surender Kumar, and Soumen Chakrabarti. 2025. Dense Retrieval with Quantity Comparison Intent. In Findings of the Association for Computational Linguistics: ACL 2025, pages 23825–23839, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Dense Retrieval with Quantity Comparison Intent (Agrawal et al., Findings 2025)
PDF:
https://preview.aclanthology.org/display_plenaries/2025.findings-acl.1220.pdf