2021
Searching for Legal Documents at Paragraph Level: Automating Label Generation and Use of an Extended Attention Mask for Boosting Neural Models of Semantic Similarity
Li Tang | Simon Clematide
Proceedings of the Natural Legal Language Processing Workshop 2021
Searching for legal documents is a specialized Information Retrieval task that is relevant for expert users (lawyers and their assistants) and for non-expert users. By searching previous court decisions (cases), a user can better prepare the legal reasoning of a new case. Being able to search with a natural language text snippet instead of a more artificial query could help to prevent query formulation issues. Also, if semantic similarity could be modeled beyond exact lexical matches, more relevant results could be found even if the query terms do not match exactly. For this domain, we formulated a task to compare different ways of modeling semantic similarity at paragraph level, using neural and non-neural systems. We compared systems that encode the query and the search collection paragraphs as vectors, enabling the use of cosine similarity for ranking results. After building a German dataset of cases and statutes from Switzerland, and extracting citations from cases to statutes, we developed an algorithm for estimating semantic similarity at paragraph level, using a link-based similarity method. When evaluating different systems in this way, we find that semantic similarity modeling by neural systems can be boosted with an extended attention mask that quenches noise in the inputs.
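The abstract describes ranking by cosine similarity over paragraph vectors and an extended attention mask that suppresses noisy input tokens. Below is a minimal sketch of that general setup, assuming a Hugging Face German BERT checkpoint and a hypothetical is_noise_token() heuristic; the paper's actual mask construction and its link-based similarity labeling are not reproduced here.

```python
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "bert-base-german-cased"  # assumption: any German BERT encoder would do
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def is_noise_token(token: str) -> bool:
    # Hypothetical noise criterion (e.g. bare numbers or punctuation runs).
    return token.isdigit() or all(ch in ".,;:()[]-" for ch in token)

def encode(paragraphs):
    batch = tokenizer(paragraphs, padding=True, truncation=True, return_tensors="pt")
    mask = batch["attention_mask"].clone()
    # "Extended" mask: additionally zero out tokens judged to be noise,
    # so they neither receive attention nor enter the pooled vector.
    for i, ids in enumerate(batch["input_ids"]):
        for j, tok in enumerate(tokenizer.convert_ids_to_tokens(ids.tolist())):
            if is_noise_token(tok):
                mask[i, j] = 0
    with torch.no_grad():
        hidden = model(input_ids=batch["input_ids"], attention_mask=mask).last_hidden_state
    weights = mask.unsqueeze(-1).float()
    # Mean-pool over the tokens kept by the mask: one vector per paragraph.
    return (hidden * weights).sum(dim=1) / weights.sum(dim=1).clamp(min=1.0)

def rank(query, collection):
    q_vec = encode([query])
    p_vecs = encode(collection)
    scores = torch.nn.functional.cosine_similarity(q_vec, p_vecs)  # cosine ranking
    return sorted(zip(collection, scores.tolist()), key=lambda pair: -pair[1])
```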
2020
UZH at SemEval-2020 Task 3: Combining BERT with WordNet Sense Embeddings to Predict Graded Word Similarity Changes
Li Tang
Proceedings of the Fourteenth Workshop on Semantic Evaluation
CoSimLex is a dataset that can be used to evaluate the ability of context-dependent word embeddings to model subtle, graded changes of meaning, as perceived by humans during reading. At SemEval-2020 Task 3, subtask 1 is about "predicting the (graded) effect of context in word similarity", using CoSimLex to quantify such a change of similarity for a pair of words from one context to another. Here, a meaning shift is composed of two aspects: a) discrete changes observed between different word senses, and b) more subtle changes of meaning representation that are not captured in those discrete changes. Therefore, this SemEval task was designed to allow the evaluation of systems that can deal with a mix of both kinds of semantic shift, as they occur in the human perception of meaning. The described system was developed to improve the BERT baseline provided with the task by reducing distortions in the BERT semantic space compared to the human semantic space. To this end, complementarity between 768- and 1024-dimensional BERT embeddings and average word sense vectors was used. With this system, after some fine-tuning, the baseline performance of 0.705 (uncentered Pearson correlation with human semantic shift data from 27 annotators) was enhanced by more than 6%, to 0.7645. We hope that this work can contribute to furthering our understanding of the semantic vector space of human perception, as it can be modeled with context-dependent word embeddings in natural language processing systems.
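The evaluation metric named above is the uncentered Pearson correlation. A minimal sketch of that metric follows, together with one hypothetical way of blending similarity scores from the 768- and 1024-dimensional BERT variants; the mixing weight alpha is an assumption for illustration, not a value from the paper.

```python
import numpy as np

def uncentered_pearson(x, y):
    # Pearson correlation without mean-centering: the cosine of the raw score vectors.
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

def blended_similarity(a_768, b_768, a_1024, b_1024, alpha=0.5):
    # Hypothetical blend of cosine similarities from 768- and 1024-dimensional
    # BERT embeddings of the two target words; alpha is an assumed weight.
    cos = lambda a, b: float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return alpha * cos(a_768, b_768) + (1.0 - alpha) * cos(a_1024, b_1024)
```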
2004
A Model of Semantic Representations Analysis for Chinese Sentences
Li Tang | Donghong Ji | Lingpeng Yang | Yu Nie
Proceedings of the Fourth International Conference on Language Resources and Evaluation (LREC’04)
Building a Conceptual Graph Bank for Chinese Language
Donghong Ji | Li Tang | Lingpeng Yang
Proceedings of the Fourth International Conference on Language Resources and Evaluation (LREC’04)
Document Re-ranking based on Global and Local Terms
Lingpeng Yang | DongHong Ji | Li Tang
Proceedings of the Third SIGHAN Workshop on Chinese Language Processing
A Large-Scale Semantic Structure for Chinese Sentences
Li Tang | Donghong Ji | Lingpeng Yang
Proceedings of the Third SIGHAN Workshop on Chinese Language Processing
Document Re-ranking Based on Automatically Acquired Key Terms in Chinese Information Retrieval
Lingpeng Yang | Donghong Ji | Li Tang
COLING 2004: Proceedings of the 20th International Conference on Computational Linguistics