Inverse is Better! Fast and Accurate Prompt for Few-shot Slot Tagging

Yutai Hou, Cheng Chen, Xianzhen Luo, Bohan Li, Wanxiang Che


Abstract
Prompting methods have recently achieved impressive success in few-shot learning. These methods modify input samples with prompt sentence pieces and decode label tokens to map samples to their corresponding labels. However, such a paradigm is very inefficient for the task of slot tagging. Since slot tagging samples are multiple consecutive words in a sentence, prompting methods have to enumerate all n-gram token spans to find all possible slots, which greatly slows down prediction. To tackle this, we introduce an inverse paradigm for prompting. Unlike classic prompts that map tokens to labels, we reversely predict slot values given slot types. Such inverse prompting requires only a one-turn prediction for each slot type and greatly speeds up prediction. Besides, we propose a novel Iterative Prediction Strategy, from which the model learns to refine predictions by considering the relations between different slot types. We find, somewhat surprisingly, that the proposed method not only predicts faster but also significantly improves accuracy (by over 6.1 F1 points in the 10-shot setting), achieving new state-of-the-art performance.
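A minimal sketch of the efficiency argument above: classic prompting must issue one query per candidate n-gram span, while inverse prompting issues one query per slot type. The templates, function names, and example sentence below are illustrative assumptions, not the paper's actual prompts or code (see atmahou/promptslottagging for the authors' implementation).

```python
# Sketch contrasting the two prompting paradigms (hypothetical
# templates and names; NOT the paper's exact prompts or code).

def classic_prompts(tokens, max_span_len=4):
    """Classic prompting: one query per candidate n-gram span,
    roughly O(len(tokens) * max_span_len) prompts per sentence."""
    sentence = " ".join(tokens)
    prompts = []
    for i in range(len(tokens)):
        for j in range(i + 1, min(i + max_span_len, len(tokens)) + 1):
            span = " ".join(tokens[i:j])
            prompts.append(f'{sentence} "{span}" is a [MASK] entity.')
    return prompts

def inverse_prompts(tokens, slot_types):
    """Inverse prompting: one query per slot type, asking the model
    to generate the slot value directly -- O(len(slot_types)) prompts."""
    sentence = " ".join(tokens)
    return [f'{sentence} The "{t}" refers to [MASK].' for t in slot_types]

tokens = "book a flight from boston to denver tomorrow".split()
slot_types = ["departure city", "arrival city", "date"]
print(len(classic_prompts(tokens)))              # 26 span-level queries
print(len(inverse_prompts(tokens, slot_types)))  # 3 queries, one per type
```

Even for this 8-token utterance, span enumeration yields 26 prompts versus 3 for the inverse paradigm; the gap grows with sentence length, which is the source of the claimed speedup.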
Anthology ID: 2022.findings-acl.53
Volume: Findings of the Association for Computational Linguistics: ACL 2022
Month: May
Year: 2022
Address: Dublin, Ireland
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 637–647
URL: https://aclanthology.org/2022.findings-acl.53
DOI: 10.18653/v1/2022.findings-acl.53
Cite (ACL): Yutai Hou, Cheng Chen, Xianzhen Luo, Bohan Li, and Wanxiang Che. 2022. Inverse is Better! Fast and Accurate Prompt for Few-shot Slot Tagging. In Findings of the Association for Computational Linguistics: ACL 2022, pages 637–647, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): Inverse is Better! Fast and Accurate Prompt for Few-shot Slot Tagging (Hou et al., Findings 2022)
PDF: https://preview.aclanthology.org/ingestion-script-update/2022.findings-acl.53.pdf
Code: atmahou/promptslottagging