Analysis and Prediction of NLP Models via Task Embeddings

Damien Sileo, Marie-Francine Moens


Abstract
Task embeddings are low-dimensional representations that are trained to capture task properties. In this paper, we propose MetaEval, a collection of 101 NLP tasks. We fit a single transformer to all MetaEval tasks jointly while conditioning it on learned embeddings. The resulting task embeddings enable a novel analysis of the space of tasks. We then show that task aspects can be mapped to task embeddings for new tasks without using any annotated examples. Predicted embeddings can modulate the encoder for zero-shot inference and outperform a zero-shot baseline on GLUE tasks. The provided multitask setup can function as a benchmark for future transfer learning research.
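The conditioning step mentioned in the abstract can be pictured with a short sketch. The code below is not the authors' implementation; it is a minimal PyTorch illustration, assuming the learned task embedding is injected by prepending it as an extra token to a shared transformer encoder. The class name (TaskConditionedEncoder), dimensions, and label count are hypothetical.

import torch
import torch.nn as nn

class TaskConditionedEncoder(nn.Module):
    """Hypothetical sketch: one shared encoder, one learned embedding per task."""
    def __init__(self, vocab_size=30522, num_tasks=101, d_model=256,
                 nhead=4, num_layers=2, num_labels=3):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        # Low-dimensional task embeddings, learned jointly with the encoder weights.
        self.task_emb = nn.Embedding(num_tasks, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.classifier = nn.Linear(d_model, num_labels)

    def forward(self, input_ids, task_ids):
        x = self.tok_emb(input_ids)               # (batch, seq_len, d_model)
        t = self.task_emb(task_ids).unsqueeze(1)  # (batch, 1, d_model)
        # Condition the encoder on the task by prepending its embedding as a token.
        h = self.encoder(torch.cat([t, x], dim=1))
        return self.classifier(h[:, 0])           # read out from the task position

# Toy usage: 8 examples of 16 tokens each, with random indices over 101 tasks.
model = TaskConditionedEncoder()
logits = model(torch.randint(0, 30522, (8, 16)), torch.randint(0, 101, (8,)))
print(logits.shape)  # torch.Size([8, 3])

For zero-shot inference on an unseen task, a predicted embedding vector could be substituted for a learned row of task_emb; how such a vector is mapped from task aspects is the contribution described in the abstract.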
Anthology ID:
2022.lrec-1.67
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
633–647
URL:
https://aclanthology.org/2022.lrec-1.67
Cite (ACL):
Damien Sileo and Marie-Francine Moens. 2022. Analysis and Prediction of NLP Models via Task Embeddings. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 633–647, Marseille, France. European Language Resources Association.
Cite (Informal):
Analysis and Prediction of NLP Models via Task Embeddings (Sileo & Moens, LREC 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2022.lrec-1.67.pdf
Code:
sileod/metaeval
Data:
MetaEval