Poirot at CMCL 2022 Shared Task: Zero Shot Crosslingual Eye-Tracking Data Prediction using Multilingual Transformer Models

Harshvardhan Srivastava


Abstract
Eye-tracking data during reading is a useful source of information for understanding the cognitive processes that take place during language comprehension. Different languages account for different cognitive triggers; however, there seem to be some uniform indicators across languages. In this paper, we describe our submission to the CMCL 2022 shared task on predicting human reading patterns for a multilingual dataset. Our model uses text representations from transformers and some hand-engineered features with a regression layer on top to predict statistical measures of mean and standard deviation for two main eye-tracking features. We train an end-to-end model to extract meaningful information from different languages and test our model on two separate datasets. We compare different transformer models and show ablation studies affecting model performance. Our final submission ranked 4th place for SubTask-1 and 1st place for SubTask-2 for the shared task.
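The architecture the abstract describes (transformer text representations concatenated with hand-engineered features, feeding a regression layer that predicts mean and standard deviation for two eye-tracking features) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the embedding dimension, the choice and number of hand-engineered features, and the hidden layer size are all assumptions.

```python
import torch
import torch.nn as nn

class EyeTrackingRegressor(nn.Module):
    """Sketch of a regression head over transformer features.

    Inputs (per word):
      token_emb  - contextual embedding from a multilingual transformer
                   (e.g. XLM-R; 768 dims is an assumption)
      hand_feats - hand-engineered features such as word length or
                   frequency (5 features is an assumption)
    Output: 4 values = mean and std for each of 2 eye-tracking features.
    """

    def __init__(self, emb_dim: int = 768, n_hand_feats: int = 5,
                 hidden: int = 128, n_targets: int = 4):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(emb_dim + n_hand_feats, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_targets),
        )

    def forward(self, token_emb: torch.Tensor,
                hand_feats: torch.Tensor) -> torch.Tensor:
        # Concatenate learned and hand-engineered features, then regress.
        x = torch.cat([token_emb, hand_feats], dim=-1)
        return self.head(x)

model = EyeTrackingRegressor()
preds = model(torch.randn(2, 768), torch.randn(2, 5))
print(preds.shape)  # torch.Size([2, 4])
```

In a zero-shot cross-lingual setting, the appeal of this design is that the multilingual encoder supplies language-agnostic representations, so the same regression head can be applied to languages unseen during training.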
Anthology ID:
2022.cmcl-1.11
Volume:
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
CMCL
Publisher:
Association for Computational Linguistics
Pages:
102–107
URL:
https://aclanthology.org/2022.cmcl-1.11
DOI:
10.18653/v1/2022.cmcl-1.11
Bibkey:
Cite (ACL):
Harshvardhan Srivastava. 2022. Poirot at CMCL 2022 Shared Task: Zero Shot Crosslingual Eye-Tracking Data Prediction using Multilingual Transformer Models. In Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics, pages 102–107, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Poirot at CMCL 2022 Shared Task: Zero Shot Crosslingual Eye-Tracking Data Prediction using Multilingual Transformer Models (Srivastava, CMCL 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.cmcl-1.11.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2022.cmcl-1.11.mp4