nclu_team at SemEval-2023 Task 6: Attention-based Approaches for Large Court Judgement Prediction with Explanation

Nicolay Rusnachenko, Thanet Markchom, Huizhi Liang


Abstract
Legal documents tend to be large in size. In this paper, we present experiments with attention-based approaches, complemented by document processing techniques, for judgment prediction. We treat explanation prediction as an extractive text summarization problem based on the output of (1) a CNN with an attention mechanism and (2) the self-attention of language models. Our extensive experiments show that treating document endings first results in a 2.1% improvement in judgment prediction across all models. Additionally, filtering out non-informative sentences improves explanation prediction performance by 4% for attention-based CNN models. Our best submissions achieved 8th and 3rd place on the judgment prediction (C1) and prediction with explanation (C2) tasks, respectively, among 11 participating teams. The results of our experiments are published.
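The abstract only sketches the approach at a high level. Below is a minimal illustrative sketch, not the authors' implementation, of two of the ideas it mentions: keeping document endings when the input exceeds an encoder's length limit, and ranking sentences by Transformer self-attention to extract explanatory sentences. The model name, the helper functions `keep_ending` and `rank_sentences_by_attention`, and the choice of aggregating the last layer's [CLS] attention row are all assumptions made for illustration.

```python
# Illustrative sketch (not the authors' code): rank sentences by the self-attention
# mass they receive from [CLS] in a Transformer encoder, and keep document endings
# when the input exceeds the model's length limit.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "bert-base-uncased"  # placeholder; assumed, not the model used in the paper
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME, output_attentions=True)

def keep_ending(sentences, max_tokens=512):
    """Keep as many sentences from the end of the document as fit the token limit."""
    kept, used = [], 0
    for sent in reversed(sentences):
        n = len(tokenizer.tokenize(sent))
        if used + n > max_tokens:
            break
        kept.append(sent)
        used += n
    return list(reversed(kept))

def rank_sentences_by_attention(sentences, top_k=3):
    """Score each sentence by the attention its tokens receive from the [CLS] token."""
    text = " ".join(sentences)
    enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        out = model(**enc)
    # Average the last layer's attention over heads, then take the [CLS] row.
    att = out.attentions[-1].mean(dim=1)[0, 0]  # shape: (seq_len,)
    # Map token-level scores back to sentences by summing over each sentence's tokens.
    scores, offset = [], 1  # offset 1 skips [CLS]
    for sent in sentences:
        n = len(tokenizer.tokenize(sent))
        scores.append(att[offset:offset + n].sum().item())
        offset += n
    ranked = sorted(range(len(sentences)), key=lambda i: scores[i], reverse=True)
    return [sentences[i] for i in ranked[:top_k]]
```

A pipeline along these lines would first apply `keep_ending` so that the final, typically decision-bearing part of a long judgment fits the encoder, then use the attention-based ranking to select candidate explanation sentences.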
Anthology ID: 2023.semeval-1.36
Volume: Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Atul Kr. Ojha, A. Seza Doğruöz, Giovanni Da San Martino, Harish Tayyar Madabushi, Ritesh Kumar, Elisa Sartori
Venue: SemEval
SIG: SIGLEX
Publisher: Association for Computational Linguistics
Pages: 270–274
URL: https://aclanthology.org/2023.semeval-1.36
DOI: 10.18653/v1/2023.semeval-1.36
Cite (ACL):
Nicolay Rusnachenko, Thanet Markchom, and Huizhi Liang. 2023. nclu_team at SemEval-2023 Task 6: Attention-based Approaches for Large Court Judgement Prediction with Explanation. In Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023), pages 270–274, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
nclu_team at SemEval-2023 Task 6: Attention-based Approaches for Large Court Judgement Prediction with Explanation (Rusnachenko et al., SemEval 2023)
PDF: https://preview.aclanthology.org/nschneid-patch-1/2023.semeval-1.36.pdf