Joël Legrand


2022

Controllable Sentence Simplification via Operation Classification
Liam Cripwell | Joël Legrand | Claire Gardent
Findings of the Association for Computational Linguistics: NAACL 2022

Different types of transformations have been used to model sentence simplification, ranging from mainly local operations such as phrasal or lexical rewriting, deletion and re-ordering, to more global operations affecting the whole input sentence, such as sentence rephrasing, copying and splitting. In this paper, we propose a novel approach to sentence simplification which encompasses four global operations: whether to rephrase or copy, and whether to split based on syntactic or discourse structure. We create a novel dataset that can be used to train highly accurate classification systems for these four operations. We propose a controllable simplification model that tailors simplifications to these operations and show that it outperforms both end-to-end, non-controllable approaches and previous controllable approaches.

2021

Discourse-Based Sentence Splitting
Liam Cripwell | Joël Legrand | Claire Gardent
Findings of the Association for Computational Linguistics: EMNLP 2021

Sentence splitting involves the segmentation of a sentence into two or more shorter sentences. It is a key component of sentence simplification, has been shown to help human comprehension, and is a useful preprocessing step for NLP tasks such as summarisation and relation extraction. While several methods and datasets have been proposed for developing sentence splitting models, little attention has been paid to how sentence splitting interacts with discourse structure. In this work, we focus on cases where the input text contains a discourse connective, which we refer to as discourse-based sentence splitting. We create synthetic and organic datasets for discourse-based splitting and explore different ways of combining these datasets using different model architectures. We show that pipeline models which use discourse structure to mediate sentence splitting outperform end-to-end models in learning the various ways of expressing a discourse relation, but generate text that is less grammatical; that large-scale synthetic data provides a better basis for learning than smaller-scale organic data; and that training on discourse-focused, rather than general, sentence splitting data provides a better basis for discourse splitting.

2018

Syntax-based Transfer Learning for the Task of Biomedical Relation Extraction
Joël Legrand | Yannick Toussaint | Chedy Raïssi | Adrien Coulet
Proceedings of the Ninth International Workshop on Health Text Mining and Information Analysis

Transfer learning (TL) aims to enhance machine learning performance on a problem by reusing labeled data originally designed for a related problem. In particular, domain adaptation consists, for a specific task, in reusing training data developed for the same task but in a distinct domain. This is particularly relevant to applications of deep learning in Natural Language Processing, because these usually require large annotated corpora that may not exist for the targeted domain but do exist for side domains. In this paper, we experiment with TL for the task of Relation Extraction (RE) from biomedical texts, using the TreeLSTM model. We empirically show the impact of TreeLSTM alone and with domain adaptation, obtaining better performance than the state of the art on two biomedical RE tasks and equal performance on two others for which little annotated data is available. Furthermore, we propose an analysis of the role that syntactic features may play in TL for RE.

2016

Phrase Representations for Multiword Expressions
Joël Legrand | Ronan Collobert
Proceedings of the 12th Workshop on Multiword Expressions

Neural Network-based Word Alignment through Score Aggregation
Joël Legrand | Michael Auli | Ronan Collobert
Proceedings of the First Conference on Machine Translation: Volume 1, Research Papers

Deep Neural Networks for Syntactic Parsing of Morphologically Rich Languages
Joël Legrand | Ronan Collobert
Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)