Hierarchical User Intent Inference with Knowledge Graph Grounding

Tzu-Cheng Peng, Chien Chin Chen, Yung-Chun Chang


Abstract
Understanding user intent in online reviews requires modeling not only explicit aspect ratings but also implicit motivations shaped by contextual factors. Existing large language models (LLMs) often lack structured grounding and fail to capture nuanced intent expressions. We propose HII-KG, a two-stage Hierarchical Intent Inference framework that first predicts fine-grained aspect ratings and then generates natural language intent statements, guided by contextual subgraphs retrieved from a domain-specific knowledge graph (KG). We first apply parameter-efficient fine-tuning to LLaMA3.1-8B to predict aspect ratings in an instruction-based format. We then leverage Cypher-aware prompting to generate user intent from KG summaries. Experiments on an online hotel review dataset show that HII-KG consistently outperforms strong LLM and encoder-based baselines in both aspect classification (avg. F1 +4.5%) and intent generation (BLEU +3.3, ROUGE-L +2.9). The results demonstrate that structured KG integration can significantly enhance fluency, contextual relevance, and factual alignment in user intent inference.
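The two-stage pipeline in the abstract (instruction-based rating prediction, then KG-grounded intent generation) can be sketched as prompt-construction helpers. This is a minimal illustrative sketch only: the aspect names, prompt templates, function names, and the Cypher schema are all assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the two-stage HII-KG pipeline; every template and
# the KG schema below are illustrative assumptions, not the paper's code.

ASPECTS = ["cleanliness", "location", "service", "value"]  # example aspects


def build_rating_prompt(review: str) -> str:
    """Stage 1: instruction-style prompt for fine-grained aspect ratings."""
    return (
        "Rate each aspect of the following hotel review from 1 to 5.\n"
        f"Aspects: {', '.join(ASPECTS)}\n"
        f"Review: {review}"
    )


def build_subgraph_query(hotel_id: str, aspects: list[str]) -> str:
    """Cypher-style query retrieving a contextual subgraph (assumed schema)."""
    aspect_list = ", ".join(f"'{a}'" for a in aspects)
    return (
        f"MATCH (h:Hotel {{id: '{hotel_id}'}})-[:HAS_ASPECT]->(a:Aspect)\n"
        f"WHERE a.name IN [{aspect_list}]\n"
        "RETURN h, a"
    )


def build_intent_prompt(review: str, kg_summary: str) -> str:
    """Stage 2: combine the review with a KG subgraph summary."""
    return (
        "Using the knowledge-graph context below, state the reviewer's "
        "intent in one sentence.\n"
        f"KG context: {kg_summary}\n"
        f"Review: {review}"
    )
```

In an actual system, the stage-1 prompt would be sent to the fine-tuned LLaMA3.1-8B, the predicted aspects would parameterize the Cypher retrieval, and the summarized subgraph would ground the stage-2 generation.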
Anthology ID:
2026.findings-eacl.285
Volume:
Findings of the Association for Computational Linguistics: EACL 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Marquez
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5371–5377
URL:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.285/
Cite (ACL):
Tzu-Cheng Peng, Chien Chin Chen, and Yung-Chun Chang. 2026. Hierarchical User Intent Inference with Knowledge Graph Grounding. In Findings of the Association for Computational Linguistics: EACL 2026, pages 5371–5377, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Hierarchical User Intent Inference with Knowledge Graph Grounding (Peng et al., Findings 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.285.pdf
Checklist:
 2026.findings-eacl.285.checklist.pdf