JUDGEBERT: Assessing Legal Meaning Preservation Between Sentences
David Beauchemin, Michelle Albert-Rochette, Richard Khoury, Pierre-Luc Déziel
Abstract
Simplifying text while preserving its meaning is a complex yet essential task, especially in sensitive domains such as the law, where meaning preservation differs significantly from its role in everyday texts. This paper introduces FrJUDGE, a new dataset for assessing legal meaning preservation between two legal texts. It also introduces JUDGEBERT, a novel evaluation metric designed to assess legal meaning preservation in French legal text simplification. JUDGEBERT correlates with human judgment more strongly than existing metrics. It also passes two crucial sanity checks that the other metrics fail: it always returns a score of 100% for two identical sentences and 0% for two unrelated sentences. Our findings highlight its potential to transform legal NLP applications by ensuring accurate and accessible text simplification for legal practitioners and lay users.
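The two sanity checks translate directly into a small test harness. The sketch below is illustrative only: `overlap_score` is a hypothetical stand-in metric, not JUDGEBERT, whose implementation is not given on this page; any metric returning a percentage in [0, 100] could be plugged in.

```python
# Minimal sketch of the two sanity checks described in the abstract:
# a meaning-preservation metric should return 100% for an identical
# pair of sentences and 0% for an unrelated pair.

def check_metric_sanity(score_fn, sentence: str, unrelated: str) -> None:
    """Assert both checks for a metric `score_fn(src, tgt) -> float in [0, 100]`."""
    assert score_fn(sentence, sentence) == 100.0, "identical pair must score 100%"
    assert score_fn(sentence, unrelated) == 0.0, "unrelated pair must score 0%"

# Trivial stand-in metric (token overlap), used only to exercise the
# harness; it is NOT the paper's metric.
def overlap_score(source: str, simplification: str) -> float:
    a, b = set(source.split()), set(simplification.split())
    return 100.0 * len(a & b) / max(len(a | b), 1)

check_metric_sanity(
    overlap_score,
    "Le locataire doit payer le loyer le premier jour du mois.",
    "La photosynthèse produit de l'oxygène.",
)
```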
- Anthology ID: 2025.emnlp-main.5
- Volume: Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
- Month: November
- Year: 2025
- Address: Suzhou, China
- Editors: Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 92–118
- URL: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.5/
- Cite (ACL): David Beauchemin, Michelle Albert-Rochette, Richard Khoury, and Pierre-Luc Déziel. 2025. JUDGEBERT: Assessing Legal Meaning Preservation Between Sentences. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 92–118, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal): JUDGEBERT: Assessing Legal Meaning Preservation Between Sentences (Beauchemin et al., EMNLP 2025)
- PDF: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.5.pdf