Forewarned Is Forearmed: When Non-Sequential Embedding Turns into an Anomaly Detector

Elys Allesiardo, Antoine Caubrière, Valentin Vielzeuf


Abstract
This paper offers an in-depth analysis of non-sequential multimodal sentence-level embeddings, with a particular focus on the SONAR model. We demonstrate that certain embedding dimensions are sensitive to perturbations and can serve as indicators of decoding anomalies. By leveraging the consistency across successive encode-decode cycles, we build an accurate anomaly detector. Additionally, we explore modifying specific dimensions of interest in an attempt to correct the anomalies. This work underscores the importance of understanding and analyzing the embeddings themselves to enhance the reliability of multimodal representations.
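The round-trip consistency idea in the abstract can be sketched in a few lines: embed a sentence, decode the embedding back to text, re-encode the decoded text, and flag the embedding as anomalous when the two embeddings diverge. The sketch below is a minimal, self-contained illustration, not the paper's method: `encode` and `decode` are toy character-count stand-ins for SONAR's real text encoder/decoder (which produces 1024-dimensional sentence embeddings), and the similarity threshold is an arbitrary illustrative value.

```python
import math

# Toy stand-in for a sentence encoder: a 26-dim character-frequency vector.
# (SONAR's real encoder would map text to a dense sentence embedding.)
def encode(text: str) -> list[float]:
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

# Toy stand-in for a decoder: rebuilds a canonical string of characters.
# Negative or non-integer dimensions (e.g. after a perturbation) are
# lossy here, which is what makes the round trip informative.
def decode(vec: list[float]) -> str:
    return "".join(
        chr(ord("a") + i) * int(round(v)) for i, v in enumerate(vec) if v > 0
    )

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def roundtrip_consistency(text: str) -> float:
    """Similarity between a text's embedding and the re-encoding of its decoding."""
    emb = encode(text)
    return cosine(emb, encode(decode(emb)))

def is_anomalous(emb: list[float], threshold: float = 0.9) -> bool:
    """Flag an embedding whose decode/re-encode round trip drifts too far."""
    return cosine(emb, encode(decode(emb))) < threshold
```

With this toy model, a clean sentence survives the round trip exactly, while a perturbed embedding (e.g. one dimension pushed negative) decodes to text whose re-encoding no longer matches, so the cosine similarity drops below the threshold.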
Anthology ID:
2026.lrec-main.797
Volume:
Proceedings of the Fifteenth Language Resources and Evaluation Conference
Month:
May
Year:
2026
Address:
Palma de Mallorca, Spain
Editors:
Stelios Piperidis, Núria Bel, Henk van den Heuvel, Nancy Ide, Simon Krek, Antonio Toral
Venue:
LREC
Publisher:
ELRA Language Resources Association
Note:
Pages:
10150–10156
URL:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.797/
Cite (ACL):
Elys Allesiardo, Antoine Caubrière, and Valentin Vielzeuf. 2026. Forewarned Is Forearmed: When Non-Sequential Embedding Turns into an Anomaly Detector. International Conference on Language Resources and Evaluation, main:10150–10156.
Cite (Informal):
Forewarned Is Forearmed: When Non-Sequential Embedding Turns into an Anomaly Detector (Allesiardo et al., LREC 2026)
PDF:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.797.pdf