Abstract
We present a multi-task learning approach to predicting semantic plausibility that leverages 50+ adapters categorized into 17 tasks within an efficient training framework. Across four English plausibility datasets of varying size and linguistic constructions, we compare models provided with knowledge from a range of NLP tasks against models without external information. Our results show that plausibility prediction benefits from complementary knowledge (e.g., provided by syntactic tasks), but the gains, while significant, are non-substantial; moreover, performance may be hurt when injecting knowledge from an unsuitable task. Equally important, we find that knowledge transfer may be hindered by class imbalance, and we demonstrate the positive yet minor effect of balancing training data, even at the expense of size.
- Anthology ID: 2024.insights-1.18
- Volume: Proceedings of the Fifth Workshop on Insights from Negative Results in NLP
- Month: June
- Year: 2024
- Address: Mexico City, Mexico
- Editors: Shabnam Tafreshi, Arjun Akula, João Sedoc, Aleksandr Drozd, Anna Rogers, Anna Rumshisky
- Venues: insights | WS
- Publisher: Association for Computational Linguistics
- Pages: 154–168
- URL: https://aclanthology.org/2024.insights-1.18
- DOI: 10.18653/v1/2024.insights-1.18
- Cite (ACL): Annerose Eichel and Sabine Schulte Im Walde. 2024. Multi-Task Learning with Adapters for Plausibility Prediction: Bridging the Gap or Falling into the Trenches?. In Proceedings of the Fifth Workshop on Insights from Negative Results in NLP, pages 154–168, Mexico City, Mexico. Association for Computational Linguistics.
- Cite (Informal): Multi-Task Learning with Adapters for Plausibility Prediction: Bridging the Gap or Falling into the Trenches? (Eichel & Schulte Im Walde, insights-WS 2024)
- PDF: https://preview.aclanthology.org/nschneid-patch-4/2024.insights-1.18.pdf
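For readers unfamiliar with the adapter modules the abstract refers to, the sketch below shows a generic bottleneck adapter (a down-projection, nonlinearity, up-projection, and residual connection, in the style commonly used for parameter-efficient multi-task learning). The hidden size, bottleneck size, and zero-initialized up-projection are illustrative assumptions, not the authors' exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

class BottleneckAdapter:
    """Minimal bottleneck adapter sketch: h + ReLU(h W_down) W_up.

    Shapes and initialization are hypothetical; the paper's 50+ adapters
    are trained inside a transformer, which is omitted here.
    """

    def __init__(self, hidden: int = 768, bottleneck: int = 48):
        # Small random down-projection into the bottleneck dimension.
        self.W_down = rng.normal(scale=0.02, size=(hidden, bottleneck))
        # Zero-initialized up-projection, so the adapter starts as identity.
        self.W_up = np.zeros((bottleneck, hidden))

    def __call__(self, h: np.ndarray) -> np.ndarray:
        z = np.maximum(h @ self.W_down, 0.0)  # ReLU nonlinearity
        return h + z @ self.W_up              # residual connection

# Usage: a batch of 4 hidden states passes through the adapter unchanged
# at initialization, since W_up is zero.
h = rng.normal(size=(4, 768))
out = BottleneckAdapter()(h)
assert out.shape == h.shape
```

Because only `W_down` and `W_up` would be trained per task, each adapter adds a small fraction of the base model's parameters, which is what makes training 50+ of them feasible.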