Enhancing Sentence Simplification in Portuguese: Leveraging Paraphrases, Context, and Linguistic Features

Arthur Scalercio, Maria Finatto, Aline Paes


Abstract
Automatic text simplification focuses on transforming texts into more comprehensible versions without sacrificing their precision. However, automatic methods usually require (paired) datasets that can be rather scarce in languages other than English. This paper presents a new approach to automatic sentence simplification that leverages paraphrases, context, and linguistic attributes to overcome the absence of paired texts in Portuguese. We frame the simplification problem as a textual style transfer task and learn a style representation from the sentences surrounding the target sentence in the document and from its linguistic attributes. Moreover, unlike most unsupervised approaches that require style-labeled training data, we fine-tune strong pre-trained models on sentence-level paraphrases instead of annotated data. Our experiments show that our model achieves remarkable results, surpassing the current state-of-the-art (BART+ACCESS) while competitively matching a Large Language Model.
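To make the recipe in the abstract concrete, here is a minimal sketch, not the authors' implementation: it fine-tunes a pre-trained seq2seq model on a sentence-level paraphrase pair while conditioning generation on a linguistic-attribute control token, in the spirit of the BART+ACCESS baseline the paper compares against. The checkpoint name, the single length-ratio attribute, and the bucketing scheme are illustrative assumptions; the paper's approach additionally builds its style representation from the surrounding document context and a richer set of linguistic features.

```python
# Minimal sketch (assumptions labeled inline), not the paper's actual code.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumption: any multilingual seq2seq checkpoint that covers Portuguese.
checkpoint = "google/mt5-small"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Register a small set of control tokens for bucketed length ratios
# (<LEN_0.5> ... <LEN_1.5>), so the attribute is a single vocabulary item.
ratio_buckets = [f"<LEN_{r / 10:.1f}>" for r in range(5, 16)]
tokenizer.add_tokens(ratio_buckets)
model.resize_token_embeddings(len(tokenizer))

def length_ratio_token(src: str, tgt: str) -> str:
    """Bucket the target/source character-length ratio into one control token."""
    ratio = min(max(len(tgt) / max(len(src), 1), 0.5), 1.5)
    return f"<LEN_{round(ratio, 1):.1f}>"

# Toy paraphrase pair: a Portuguese sentence and a shorter paraphrase of it.
src = "O fenômeno foi observado por diversos pesquisadores ao longo das últimas décadas."
tgt = "Vários pesquisadores observaram o fenômeno nas últimas décadas."

# Prepend the attribute token so the model learns to associate it with the rewrite style.
inputs = tokenizer(f"{length_ratio_token(src, tgt)} {src}", return_tensors="pt")
labels = tokenizer(tgt, return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss  # standard cross-entropy over the paraphrase
loss.backward()                             # an optimizer step would follow in real training
```

At inference time, the same control token would be set by the user (e.g., a low length ratio to request a shorter, simpler rewrite), which is what makes the attribute conditioning useful once the model has been fine-tuned only on paraphrases.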
Anthology ID:
2024.findings-acl.895
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15076–15091
URL:
https://aclanthology.org/2024.findings-acl.895
DOI:
10.18653/v1/2024.findings-acl.895
Cite (ACL):
Arthur Scalercio, Maria Finatto, and Aline Paes. 2024. Enhancing Sentence Simplification in Portuguese: Leveraging Paraphrases, Context, and Linguistic Features. In Findings of the Association for Computational Linguistics: ACL 2024, pages 15076–15091, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Enhancing Sentence Simplification in Portuguese: Leveraging Paraphrases, Context, and Linguistic Features (Scalercio et al., Findings 2024)
PDF:
https://preview.aclanthology.org/autopr/2024.findings-acl.895.pdf