Robustness of Demonstration-based Learning Under Limited Data Scenario

Hongxin Zhang, Yanzhe Zhang, Ruiyi Zhang, Diyi Yang


Abstract
Demonstration-based learning has shown great potential in stimulating pretrained language models’ ability under limited data scenarios. Simply augmenting the input with some demonstrations can significantly improve performance on few-shot NER. However, why such demonstrations are beneficial for the learning process remains unclear, since there is no explicit alignment between the demonstrations and the predictions. In this paper, we design pathological demonstrations by gradually removing intuitively useful information from the standard ones to take a deep dive into the robustness of demonstration-based sequence labeling, and show that (1) demonstrations composed of random tokens still make the model a better few-shot learner; (2) the length of random demonstrations and the relevance of random tokens are the main factors affecting the performance; (3) demonstrations increase the confidence of model predictions on captured superficial patterns. We have publicly released our code at https://github.com/SALT-NLP/RobustDemo.
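To make the setup described in the abstract concrete, below is a minimal sketch of demonstration-based input augmentation for few-shot NER. The separator token, the "token is TAG" entity template, the helper names, and the example sentences are illustrative assumptions, not the authors' exact format; see the paper and the released code for the actual templates.

```python
# Minimal sketch (assumed format): append labeled demonstrations to the input
# sentence so the tagger can condition on them in-context.

def build_demonstration(example_tokens, example_tags):
    """Render one labeled sentence as a demonstration string (assumed template)."""
    entities = [f"{tok} is {tag}." for tok, tag in zip(example_tokens, example_tags) if tag != "O"]
    return " ".join(example_tokens) + " " + " ".join(entities)

def augment_input(input_tokens, support_set, sep=" [SEP] "):
    """Concatenate the input sentence with demonstrations from a small support set."""
    demos = [build_demonstration(toks, tags) for toks, tags in support_set]
    return " ".join(input_tokens) + sep + sep.join(demos)

if __name__ == "__main__":
    support = [
        (["Barack", "Obama", "visited", "Paris", "."],
         ["PER", "PER", "O", "LOC", "O"]),
    ]
    print(augment_input(["Angela", "Merkel", "met", "reporters", "."], support))
```

The paper's "pathological" variants would replace the demonstration text above with, e.g., random tokens, while keeping the overall augmented-input structure fixed.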
Anthology ID:
2022.emnlp-main.116
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1769–1782
URL:
https://aclanthology.org/2022.emnlp-main.116
DOI:
10.18653/v1/2022.emnlp-main.116
Cite (ACL):
Hongxin Zhang, Yanzhe Zhang, Ruiyi Zhang, and Diyi Yang. 2022. Robustness of Demonstration-based Learning Under Limited Data Scenario. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 1769–1782, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Robustness of Demonstration-based Learning Under Limited Data Scenario (Zhang et al., EMNLP 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-3/2022.emnlp-main.116.pdf
Software:
2022.emnlp-main.116.software.zip