ProMALex: Progressive Modular Adapters for Multi-Jurisdictional Legal Language Modeling

Santosh T.y.s.s, Mohamed Hesham Elganayni


Abstract
This paper addresses the challenge of adapting language models to the jurisdiction-specific nature of legal corpora. Existing approaches—training separate models for each jurisdiction or using a single shared model—either fail to leverage common legal principles beneficial for low-resource settings or risk negative interference from conflicting jurisdictional interpretations. To overcome these limitations, we propose ProMALex, a parameter-efficient framework that first derives hierarchical relationships across jurisdictions and then progressively inserts adapter modules across model layers based on jurisdictional similarity. This design allows modules in lower layers to be shared across jurisdictions, capturing common legal principles, while higher layers specialize through jurisdiction-specific adapters. Experimental results on two legal language modeling benchmarks demonstrate that ProMALex outperforms both fully shared and jurisdiction-specific models.
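The abstract's core mechanism, sharing adapters at lower layers and specializing at higher ones, can be sketched as a layer-wise assignment of adapter modules to jurisdiction clusters. The sketch below is illustrative only: the class and function names, the toy jurisdiction hierarchy, and the three-layer setup are assumptions, not the authors' implementation; the paper should be consulted for how the hierarchy is actually derived.

```python
from itertools import count

class Adapter:
    """Stand-in for a bottleneck adapter module; only its identity
    (which jurisdictions share it, at which layer) matters here."""
    _ids = count()

    def __init__(self, layer, cluster):
        self.id = next(Adapter._ids)
        self.layer = layer
        self.cluster = cluster

def build_adapter_plan(layer_partitions):
    """Map (layer, jurisdiction) -> Adapter.

    layer_partitions[i] is a partition of the jurisdictions at layer i.
    One adapter is instantiated per cluster per layer, so jurisdictions
    in the same cluster share that adapter's parameters. Coarse
    partitions at lower layers capture common legal principles; finer
    partitions at higher layers give jurisdiction-specific capacity.
    """
    plan = {}
    for layer, partition in enumerate(layer_partitions):
        for cluster in partition:
            adapter = Adapter(layer, cluster)  # one adapter per cluster
            for jurisdiction in cluster:
                plan[(layer, jurisdiction)] = adapter
    return plan

# Toy 3-layer hierarchy (hypothetical): fully shared -> split by legal
# family -> fully jurisdiction-specific.
partitions = [
    [("US", "UK", "DE")],         # layer 0: one adapter shared by all
    [("US", "UK"), ("DE",)],      # layer 1: common-law vs. civil-law
    [("US",), ("UK",), ("DE",)],  # layer 2: per-jurisdiction adapters
]
plan = build_adapter_plan(partitions)
```

In this toy plan, `plan[(0, "US")]` and `plan[(0, "DE")]` are the same object (shared lower-layer adapter), while `plan[(2, "US")]` and `plan[(2, "UK")]` are distinct, mirroring the progressive specialization the abstract describes.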
Anthology ID:
2025.acl-long.1080
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
22201–22217
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1080/
Cite (ACL):
Santosh T.y.s.s and Mohamed Hesham Elganayni. 2025. ProMALex: Progressive Modular Adapters for Multi-Jurisdictional Legal Language Modeling. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 22201–22217, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
ProMALex: Progressive Modular Adapters for Multi-Jurisdictional Legal Language Modeling (T.y.s.s & Elganayni, ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1080.pdf