High-quality argumentative information in low resources approaches improve counter-narrative generation

Damián Furman, Pablo Torres, José Rodríguez, Diego Letzen, Maria Martinez, Laura Alemany


Abstract
It has been shown that high-quality fine-tuning boosts the performance of language models, even when the fine-tuning dataset is small. In this work we show how highly targeted fine-tuning improves hate speech counter-narrative generation for user-generated text, even with very small training sets (1,722 counter-narratives for English and 355 for Spanish). Providing a small subset of examples that focus on a single argumentative strategy, together with the argumentative analysis relevant to that strategy, yields counter-narratives as satisfactory as those obtained by providing the whole set of counter-narratives. We also show that a good base model is required for the fine-tuning to have a positive impact: for Spanish, the counter-narratives obtained without fine-tuning are mostly unacceptable, and, while fine-tuning improves their overall quality, performance still remains quite unsatisfactory.
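To make the setup concrete, below is a minimal sketch of the kind of targeted fine-tuning the abstract describes: a seq2seq model trained on pairs whose input combines the hate speech message with the argumentative analysis for one strategy, and whose target is the counter-narrative. The base model, prompt format, data fields, and hyperparameters are illustrative assumptions, not the authors' actual pipeline.

# Illustrative sketch only: minimal fine-tuning loop for counter-narrative
# generation conditioned on hate speech plus an argumentative analysis.
# Model name, prompt format, and data fields are placeholders, not the
# authors' actual configuration.
import torch
from torch.optim import AdamW
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "google/flan-t5-base"  # assumed base model for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Hypothetical training pair: input = hate speech + analysis of one
# argumentative strategy; target = the gold counter-narrative.
examples = [{
    "hate_speech": "Immigrants are taking all our jobs.",
    "analysis": "Premise under attack: job scarcity. Strategy: rebut with facts.",
    "counter_narrative": "Research consistently finds that immigration "
                         "creates jobs and grows the economy.",
}]

optimizer = AdamW(model.parameters(), lr=5e-5)
model.train()
for epoch in range(3):
    for ex in examples:
        prompt = (f"Hate speech: {ex['hate_speech']}\n"
                  f"Analysis: {ex['analysis']}\n"
                  f"Counter-narrative:")
        inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
        labels = tokenizer(ex["counter_narrative"], return_tensors="pt",
                           truncation=True).input_ids
        loss = model(**inputs, labels=labels).loss  # standard seq2seq loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

At inference time, the same prompt format would be passed to model.generate to produce a counter-narrative for an unseen message; with training sets this small (hundreds to a couple thousand examples), the loop above runs in minutes on a single GPU.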
Anthology ID:
2023.findings-emnlp.194
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2942–2956
URL:
https://aclanthology.org/2023.findings-emnlp.194
DOI:
10.18653/v1/2023.findings-emnlp.194
Cite (ACL):
Damián Furman, Pablo Torres, José Rodríguez, Diego Letzen, Maria Martinez, and Laura Alemany. 2023. High-quality argumentative information in low resources approaches improve counter-narrative generation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 2942–2956, Singapore. Association for Computational Linguistics.
Cite (Informal):
High-quality argumentative information in low resources approaches improve counter-narrative generation (Furman et al., Findings 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.findings-emnlp.194.pdf