Abstract
The current charge prediction datasets mostly focus on single-defendant criminal cases. However, real-world criminal cases usually involve multiple defendants whose criminal facts are intertwined. As an early attempt to fill this gap, we introduce a new benchmark that encompasses legal cases involving multiple defendants, where each defendant is labeled with a charge and four types of crime elements, i.e., Object Element, Objective Element, Subject Element, and Subjective Element. Based on the dataset, we further develop an interpretable model called EJudge that incorporates crime elements and legal rules to infer charges. We observe that predicting crime charges while providing corresponding rationales benefits interpretable AI systems. Extensive experiments show that EJudge significantly surpasses state-of-the-art methods, verifying the importance of crime elements and legal rules in multi-defendant charge prediction. The source code and dataset are available at https://anonymous.4open.science/r/MCP_1-6010.
- Anthology ID:
- 2024.acl-long.158
- Volume:
- Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- Month:
- August
- Year:
- 2024
- Address:
- Bangkok, Thailand
- Editors:
- Lun-Wei Ku, Andre Martins, Vivek Srikumar
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 2864–2878
- URL:
- https://aclanthology.org/2024.acl-long.158
- DOI:
- 10.18653/v1/2024.acl-long.158
- Cite (ACL):
- Xiao Wei, Qi Xu, Hang Yu, Qian Liu, and Erik Cambria. 2024. Through the MUD: A Multi-Defendant Charge Prediction Benchmark with Linked Crime Elements. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2864–2878, Bangkok, Thailand. Association for Computational Linguistics.
- Cite (Informal):
- Through the MUD: A Multi-Defendant Charge Prediction Benchmark with Linked Crime Elements (Wei et al., ACL 2024)
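- BibTeX (assembled from the metadata listed above; the citation key is illustrative, not confirmed against the official Anthology entry):
```bibtex
% Citation key is illustrative; all fields are taken from the page metadata above.
@inproceedings{wei-etal-2024-mud,
    title = "Through the {MUD}: A Multi-Defendant Charge Prediction Benchmark with Linked Crime Elements",
    author = "Wei, Xiao and Xu, Qi and Yu, Hang and Liu, Qian and Cambria, Erik",
    editor = "Ku, Lun-Wei and Martins, Andre and Srikumar, Vivek",
    booktitle = "Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = aug,
    year = "2024",
    address = "Bangkok, Thailand",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2024.acl-long.158",
    doi = "10.18653/v1/2024.acl-long.158",
    pages = "2864--2878",
}
```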
- PDF:
- https://preview.aclanthology.org/nschneid-patch-5/2024.acl-long.158.pdf