Empowering Conversational Agents using Semantic In-Context Learning

Amin Omidvar, Aijun An


Abstract
Language models are among the biggest game changers in downstream NLP applications, especially in conversational agents. Despite their impressive ability to generate responses that resolve user inquiries, major challenges remain in deploying them. The first is how to enable LLMs to use private internal data to answer inquiries. The second is how to keep LLMs up to date with newly arriving data without the burden of fine-tuning, which is not only expensive but also unavailable for some commercial LLMs, such as ChatGPT. In this work, we propose Semantic In-Context Learning (S-ICL) to address these challenges. Our approach participated in the BEA 2023 shared task and placed fourth in both the development and evaluation phases.
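The core idea behind semantic in-context learning, as described in the abstract, is to keep private or newly arriving data outside the model and instead retrieve the most semantically relevant examples into the prompt at query time, avoiding fine-tuning entirely. The sketch below illustrates that retrieval-then-prompt pattern under stated assumptions: the tiny knowledge base, the bag-of-words `embed` function (a stand-in for a real sentence encoder), and the prompt template are all hypothetical, not the authors' actual implementation.

```python
from collections import Counter
import math

# Hypothetical private knowledge base of past (question, answer) pairs.
KB = [
    ("How do I reset my password?", "Use the account settings page."),
    ("How do I change my email address?", "Edit it under profile settings."),
    ("What is the refund policy?", "Refunds are issued within 30 days."),
]

def embed(text):
    """Toy bag-of-words vector; a real system would use a sentence encoder."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(count * b.get(term, 0) for term, count in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def build_prompt(query, k=2):
    """Select the k semantically closest examples and prepend them to the prompt."""
    q_vec = embed(query)
    ranked = sorted(KB, key=lambda qa: cosine(q_vec, embed(qa[0])), reverse=True)
    demos = "\n".join(f"Q: {q}\nA: {a}" for q, a in ranked[:k])
    return f"{demos}\nQ: {query}\nA:"

prompt = build_prompt("How can I reset my password?")
```

Because the retrieved examples travel inside the prompt, updating the agent with new data only requires appending to the knowledge base, which is what makes the approach usable with closed commercial LLMs.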
Anthology ID:
2023.bea-1.62
Volume:
Proceedings of the 18th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2023)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Ekaterina Kochmar, Jill Burstein, Andrea Horbach, Ronja Laarmann-Quante, Nitin Madnani, Anaïs Tack, Victoria Yaneva, Zheng Yuan, Torsten Zesch
Venue:
BEA
SIG:
SIGEDU
Publisher:
Association for Computational Linguistics
Pages:
766–771
URL:
https://aclanthology.org/2023.bea-1.62
DOI:
10.18653/v1/2023.bea-1.62
Cite (ACL):
Amin Omidvar and Aijun An. 2023. Empowering Conversational Agents using Semantic In-Context Learning. In Proceedings of the 18th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2023), pages 766–771, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Empowering Conversational Agents using Semantic In-Context Learning (Omidvar & An, BEA 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2023.bea-1.62.pdf