Incorporating Behavioral Hypotheses for Query Generation

Ruey-Cheng Chen, Chia-Jung Lee


Abstract
Generative neural networks have been shown to be effective for query suggestion. Commonly posed as a conditional generation problem, the task aims to leverage earlier inputs from users in a search session to predict queries that they will likely issue at a later time. User inputs come in various forms such as querying and clicking, each of which can imply different semantic signals channeled through the corresponding behavioral patterns. This paper induces these behavioral biases as hypotheses for query generation, where a generic encoder-decoder Transformer framework is presented to aggregate arbitrary hypotheses of choice. Our experimental results show that the proposed approach leads to significant improvements on top-k word error rate and BERT F1 score compared to a recent BART model.
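The hypothesis-aggregation idea can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes that session signals (earlier queries and clicked result titles) are simply flattened into one encoder input with separator tokens and passed to an off-the-shelf BART checkpoint from the Hugging Face transformers library. The paper's actual framework aggregates arbitrary hypotheses inside its own encoder-decoder Transformer; the untuned checkpoint and the example session below are hypothetical and serve only to keep the sketch runnable.

# Minimal sketch (assumed setup, not the paper's method): flatten behavioral
# "hypotheses" from a search session into a single encoder input and let a
# BART-style encoder-decoder propose the next query.
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# Hypothetical session signals: earlier queries and a clicked result title.
hypotheses = [
    "cheap flights to tokyo",               # earlier query
    "Tokyo flight deals - Example Travel",  # clicked title
    "best time to visit japan",             # earlier query
]

# Aggregate the hypotheses by joining them with the tokenizer's separator token.
source = f" {tokenizer.sep_token} ".join(hypotheses)
inputs = tokenizer(source, return_tensors="pt", truncation=True, max_length=512)

# Generate a few candidate next queries with beam search; a model fine-tuned on
# session data would be needed for meaningful suggestions.
outputs = model.generate(
    **inputs, num_beams=5, num_return_sequences=3, max_length=20, early_stopping=True
)
for ids in outputs:
    print(tokenizer.decode(ids, skip_special_tokens=True))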
Anthology ID:
2020.emnlp-main.251
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3105–3110
URL:
https://aclanthology.org/2020.emnlp-main.251
DOI:
10.18653/v1/2020.emnlp-main.251
Cite (ACL):
Ruey-Cheng Chen and Chia-Jung Lee. 2020. Incorporating Behavioral Hypotheses for Query Generation. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3105–3110, Online. Association for Computational Linguistics.
Cite (Informal):
Incorporating Behavioral Hypotheses for Query Generation (Chen & Lee, EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.251.pdf
Video:
https://slideslive.com/38939310