BrainPredict: a Tool for Predicting and Visualising Local Brain Activity

Youssef Hmamouche, Laurent Prévot, Magalie Ochs, Thierry Chaminade


Abstract
In this paper, we present a tool allowing dynamic prediction and visualisation of an individual’s local brain activity during a conversation. The prediction module of this tool is based on classifiers trained on a corpus of human-human and human-robot conversations that includes fMRI recordings. More precisely, the module takes as input behavioral features computed from raw data, mainly the participant’s and the interlocutor’s speech, but also the participant’s visual input and eye movements. The visualisation module shows in real time the dynamics of active brain areas, synchronised with the raw behavioral data. In addition, it shows which integrated behavioral features are used to predict the activity in individual brain areas.
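The pipeline the abstract describes (integrated behavioral features in, a per-region activity prediction out, handed to a visualisation module) might be sketched roughly as below. This is a minimal illustrative sketch, not the authors' implementation: the region count, feature set, toy nearest-centroid classifier, and synthetic labels are all assumptions made for the example.

```python
import random

random.seed(0)

N_REGIONS = 4    # illustrative; the tool predicts activity in many brain areas
N_FEATURES = 3   # e.g. speech activity, visual input, eye movement (assumed)

def make_frame():
    """One time step of integrated behavioral features (synthetic data)."""
    return [random.random() for _ in range(N_FEATURES)]

class CentroidClassifier:
    """Toy stand-in for the per-region classifiers: nearest class centroid."""
    def fit(self, X, y):
        self.centroids = {}
        for label in set(y):
            rows = [x for x, lab in zip(X, y) if lab == label]
            self.centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
        return self

    def predict(self, x):
        def sq_dist(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda lab: sq_dist(self.centroids[lab]))

# Train one classifier per brain region on (features, active?) pairs.
# Labels here are synthetic; in the tool they come from the fMRI corpus.
X = [make_frame() for _ in range(200)]
models = {}
for region in range(N_REGIONS):
    y = [int(sum(x) > 1.5) for x in X]
    models[region] = CentroidClassifier().fit(X, y)

# At each new time step, predict which regions are active; this per-region
# activity map is what a visualisation module would render.
frame = make_frame()
activity = {region: model.predict(frame) for region, model in models.items()}
print(activity)
```

In this sketch every region shares the same features and labels; the point is only the structure of the loop: one trained classifier per brain area, queried once per time step to drive the display.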
Anthology ID:
2020.lrec-1.89
Volume:
Proceedings of the Twelfth Language Resources and Evaluation Conference
Month:
May
Year:
2020
Address:
Marseille, France
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
710–716
Language:
English
URL:
https://aclanthology.org/2020.lrec-1.89
Cite (ACL):
Youssef Hmamouche, Laurent Prévot, Magalie Ochs, and Thierry Chaminade. 2020. BrainPredict: a Tool for Predicting and Visualising Local Brain Activity. In Proceedings of the Twelfth Language Resources and Evaluation Conference, pages 710–716, Marseille, France. European Language Resources Association.
Cite (Informal):
BrainPredict: a Tool for Predicting and Visualising Local Brain Activity (Hmamouche et al., LREC 2020)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2020.lrec-1.89.pdf