How DDAIR you? Disambiguated Data Augmentation for Intent Recognition

Galo Castillo-López, Alexis Lombard, Nasredine Semmar, Gaël de Chalendar


Abstract
Large Language Models (LLMs) are effective for data augmentation in classification tasks like intent detection. In some cases, they inadvertently produce examples that are ambiguous with regard to untargeted classes. We present DDAIR (Disambiguated Data Augmentation for Intent Recognition) to mitigate this problem. We use Sentence Transformers to detect ambiguous class-guided augmented examples generated by LLMs for intent recognition in low-resource scenarios. We identify synthetic examples that are semantically more similar to another intent than to their target one. We also provide an iterative re-generation method to mitigate such ambiguities. Our findings show that sentence embeddings effectively help to (re)generate less ambiguous examples, and suggest promising potential to improve classification performance in scenarios where intents are loosely or broadly defined.
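The ambiguity filter described in the abstract can be sketched as follows. This is an illustrative assumption, not the paper's exact method: it compares a synthetic example's embedding against per-intent centroids and flags the example when it is closer to a non-target intent, with toy 2-D vectors standing in for Sentence Transformer embeddings.

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_ambiguous(example_emb: np.ndarray, target_intent: str,
                   intent_centroids: dict) -> bool:
    """Return True if the synthetic example is semantically closer to
    a non-target intent centroid than to its target intent centroid.
    Flagged examples would be sent back to the LLM for re-generation."""
    sims = {intent: cosine(example_emb, c)
            for intent, c in intent_centroids.items()}
    return max(sims, key=sims.get) != target_intent

# Toy centroids (in practice: mean Sentence Transformer embedding
# of each intent's seed examples -- a hypothetical setup).
centroids = {
    "book_flight": np.array([1.0, 0.0]),
    "cancel_flight": np.array([0.0, 1.0]),
}

# A synthetic example generated for "book_flight" whose embedding
# lands closer to "cancel_flight" -> ambiguous, should be re-generated.
emb = np.array([0.2, 0.9])
print(flag_ambiguous(emb, "book_flight", centroids))  # True
```

In a real pipeline the embeddings would come from a Sentence Transformers model (e.g. `SentenceTransformer(...).encode(texts)`), and flagged examples would loop back through the LLM prompt, matching the iterative re-generation step the abstract describes.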
Anthology ID:
2026.eacl-short.20
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
274–286
URL:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-short.20/
Cite (ACL):
Galo Castillo-López, Alexis Lombard, Nasredine Semmar, and Gaël de Chalendar. 2026. How DDAIR you? Disambiguated Data Augmentation for Intent Recognition. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers), pages 274–286, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
How DDAIR you? Disambiguated Data Augmentation for Intent Recognition (Castillo-López et al., EACL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-short.20.pdf
Checklist:
 2026.eacl-short.20.checklist.pdf