Soft Alignment Objectives for Robust Adaptation of Language Generation

Michal Štefánik, Marek Kadlcik, Petr Sojka


Abstract
Domain adaptation allows generative language models to address specific flaws caused by the domain shift of their application. However, traditional adaptation by further training on in-domain data rapidly weakens the model’s ability to generalize to other domains, making open-ended deployment of the adapted models prone to errors. This work introduces novel training objectives built upon the semantic similarity of the predicted tokens to the reference. Our results show that (1) avoiding the common assumption of a single correct prediction by constructing the training target from the tokens’ semantic similarity can largely mitigate the catastrophic forgetting caused by adaptation, while (2) preserving the in-domain quality of the adaptation, (3) with negligible additional compute cost. In the broader context, objectives grounded in continuous token similarity pioneer the exploration of the middle ground between efficient but naive exact-match token-level objectives and expressive but computationally and resource-intensive sequential objectives.
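The core idea of the abstract lends itself to a short illustration. Below is a minimal PyTorch sketch of one plausible instantiation of such a soft objective: instead of a one-hot target, the training target is a distribution over the vocabulary weighted by each token's embedding similarity to the reference token. The function name, the cosine-similarity choice, and the temperature parameter are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def soft_alignment_loss(logits, target_ids, embeddings, temperature=0.1):
    """Cross-entropy against a soft target distribution built from the
    similarity of every vocabulary token's embedding to the reference
    token's embedding, rather than a one-hot target.

    logits:      (batch, seq_len, vocab) model predictions
    target_ids:  (batch, seq_len) reference token ids
    embeddings:  (vocab, dim) token embedding matrix
    """
    # Normalise embeddings so the dot product equals cosine similarity.
    emb = F.normalize(embeddings, dim=-1)                 # (V, d)
    ref = emb[target_ids]                                 # (B, T, d)
    sims = ref @ emb.T                                    # (B, T, V)

    # Sharpen the similarities into a soft target distribution.
    soft_targets = F.softmax(sims / temperature, dim=-1)  # (B, T, V)

    log_probs = F.log_softmax(logits, dim=-1)
    return -(soft_targets * log_probs).sum(dim=-1).mean()

# Example usage with toy shapes (hypothetical values):
# logits = torch.randn(2, 5, 1000)
# target_ids = torch.randint(0, 1000, (2, 5))
# embeddings = torch.randn(1000, 64)
# loss = soft_alignment_loss(logits, target_ids, embeddings)
```

Note that as the temperature approaches zero, the soft target collapses back to the standard one-hot target (the reference token has maximal self-similarity), which makes the relation to the exact-match objective explicit.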
Anthology ID:
2023.acl-long.492
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
8837–8853
URL:
https://aclanthology.org/2023.acl-long.492
DOI:
10.18653/v1/2023.acl-long.492
Cite (ACL):
Michal Štefánik, Marek Kadlcik, and Petr Sojka. 2023. Soft Alignment Objectives for Robust Adaptation of Language Generation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 8837–8853, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Soft Alignment Objectives for Robust Adaptation of Language Generation (Štefánik et al., ACL 2023)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2023.acl-long.492.pdf
Video:
https://preview.aclanthology.org/ingest-2024-clasp/2023.acl-long.492.mp4