Fine-Tuning Pre-Trained Language Models with Gaze Supervision

Shuwen Deng, Paul Prasse, David Reich, Tobias Scheffer, Lena Jäger


Abstract
Human gaze data provide cognitive information that reflects human language comprehension and have been effectively integrated into a variety of natural language processing (NLP) tasks, demonstrating improved performance over corresponding plain-text-based models. In this work, we propose to integrate a gaze module into pre-trained language models (LMs) at the fine-tuning stage to improve their ability to learn representations that are grounded in human language processing. This is done by extending the conventional purely text-based fine-tuning objective with an auxiliary loss that exploits cognitive signals. The gaze module is only included during training, retaining compatibility with existing pre-trained LM-based pipelines. We evaluate the proposed approach with two distinct pre-trained LMs on the GLUE benchmark and observe that it improves performance over both standard fine-tuning and traditional text-augmentation baselines.
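
As a rough illustration of the training setup described in the abstract, the following minimal sketch (assuming a PyTorch/transformers classification setting) adds a token-level gaze-prediction head next to the task head and sums the two losses during fine-tuning; only the task head is used at inference, so downstream pipelines stay unchanged. The gaze head, its output size, the loss weight lambda_gaze, and the batch fields (e.g. gaze_targets) are hypothetical assumptions for illustration, not the authors' released implementation.

```python
import torch.nn as nn
from transformers import AutoModel


class GazeSupervisedClassifier(nn.Module):
    # Hypothetical model: pre-trained encoder + task head + auxiliary gaze head.
    def __init__(self, model_name="bert-base-uncased", num_labels=2, num_gaze_features=1):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.task_head = nn.Linear(hidden, num_labels)          # kept after fine-tuning
        self.gaze_head = nn.Linear(hidden, num_gaze_features)   # training-only gaze module

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_states = out.last_hidden_state                    # (batch, seq_len, hidden)
        logits = self.task_head(token_states[:, 0])             # [CLS] representation for the task
        gaze_pred = self.gaze_head(token_states).squeeze(-1)    # token-level gaze estimates
        return logits, gaze_pred


def training_step(model, batch, lambda_gaze=0.1):
    # Combine the task loss with an auxiliary gaze loss; the weighting is illustrative.
    logits, gaze_pred = model(batch["input_ids"], batch["attention_mask"])
    task_loss = nn.functional.cross_entropy(logits, batch["labels"])
    # Mean-squared error against per-token gaze targets, masked so padding does not contribute.
    mask = batch["attention_mask"].float()
    gaze_loss = (((gaze_pred - batch["gaze_targets"]) ** 2) * mask).sum() / mask.sum()
    return task_loss + lambda_gaze * gaze_loss
```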
Anthology ID:
2024.acl-short.21
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
217–224
URL:
https://aclanthology.org/2024.acl-short.21
DOI:
10.18653/v1/2024.acl-short.21
Cite (ACL):
Shuwen Deng, Paul Prasse, David Reich, Tobias Scheffer, and Lena Jäger. 2024. Fine-Tuning Pre-Trained Language Models with Gaze Supervision. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 217–224, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Fine-Tuning Pre-Trained Language Models with Gaze Supervision (Deng et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-short.21.pdf