Retrieval-based Language Models and Applications

Akari Asai, Sewon Min, Zexuan Zhong, Danqi Chen


Abstract
Retrieval-based language models (LMs) have shown impressive performance on diverse NLP tasks. In this tutorial, we will provide a comprehensive and coherent overview of recent advances in retrieval-based LMs. We will start with preliminaries covering the foundations of LMs (e.g., masked LMs, autoregressive LMs) and retrieval systems (e.g., nearest-neighbor search). We will then detail recent progress in retrieval-based models, focusing on their model architectures and learning approaches. Next, we will show how retrieval-based LMs are adapted to downstream applications and extended to multilingual and multi-modal settings. Finally, we will use an exercise to showcase the effectiveness of retrieval-based LMs.
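As an illustration of the nearest-neighbor search step mentioned in the preliminaries, the following is a minimal sketch (not taken from the tutorial materials): it assumes documents and the query have already been embedded by some encoder, and uses brute-force inner-product search with NumPy in place of a production approximate nearest-neighbor index.

# Minimal sketch of the nearest-neighbor retrieval step behind retrieval-based LMs.
# Random vectors stand in for real encoder outputs; a production system would
# typically use an approximate nearest-neighbor index rather than brute force.
import numpy as np

rng = np.random.default_rng(0)

# Toy "corpus": 1,000 document embeddings of dimension 128, unit-normalized
# so that inner product equals cosine similarity.
doc_embeddings = rng.standard_normal((1000, 128)).astype(np.float32)
doc_embeddings /= np.linalg.norm(doc_embeddings, axis=1, keepdims=True)

# A single query embedding, also unit-normalized.
query = rng.standard_normal(128).astype(np.float32)
query /= np.linalg.norm(query)

def retrieve(query_vec: np.ndarray, docs: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k documents with the highest similarity to the query."""
    scores = docs @ query_vec                # (num_docs,)
    topk = np.argpartition(-scores, k)[:k]   # unordered top-k candidates
    return topk[np.argsort(-scores[topk])]   # top-k sorted by descending score

top_docs = retrieve(query, doc_embeddings, k=5)
print("Top-5 retrieved document ids:", top_docs)
# A retrieval-augmented LM would then condition generation on the retrieved
# documents, e.g., by prepending their text to the LM's input.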
Anthology ID:
2023.acl-tutorials.6
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 6: Tutorial Abstracts)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Yun-Nung (Vivian) Chen, Margot Mieskes, Siva Reddy
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
41–46
URL:
https://aclanthology.org/2023.acl-tutorials.6
DOI:
10.18653/v1/2023.acl-tutorials.6
Cite (ACL):
Akari Asai, Sewon Min, Zexuan Zhong, and Danqi Chen. 2023. Retrieval-based Language Models and Applications. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 6: Tutorial Abstracts), pages 41–46, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Retrieval-based Language Models and Applications (Asai et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-tutorials.6.pdf