@inproceedings{khallaf-etal-2022-towards,
    title = "Towards {A}rabic Sentence Simplification via Classification and Generative Approaches",
    author = "Khallaf, Nouran  and
      Sharoff, Serge  and
      Soliman, Rasha",
    editor = "Bouamor, Houda  and
      Al-Khalifa, Hend  and
      Darwish, Kareem  and
      Rambow, Owen  and
      Bougares, Fethi  and
      Abdelali, Ahmed  and
      Tomeh, Nadi  and
      Khalifa, Salam  and
      Zaghouani, Wajdi",
    booktitle = "Proceedings of the Seventh Arabic Natural Language Processing Workshop (WANLP)",
    month = dec,
    year = "2022",
    address = "Abu Dhabi, United Arab Emirates (Hybrid)",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2022.wanlp-1.5/",
    doi = "10.18653/v1/2022.wanlp-1.5",
    pages = "43--52",
    abstract = "This paper presents an attempt to build a Modern Standard Arabic (MSA) sentence-level simplification system. We experimented with sentence simplification using two approaches: (i) a classification approach leading to lexical simplification pipelines which use Arabic-BERT, a pre-trained contextualised model, as well as a model of fastText word embeddings; and (ii) a generative approach, a Seq2Seq technique applying the multilingual Text-to-Text Transfer Transformer (mT5). We developed our training corpus by aligning the original and simplified sentences from the internationally acclaimed Arabic novel Saaq al-Bambuu. We evaluate the effectiveness of these methods by comparing the generated simple sentences to the target simple sentences using the BERTScore evaluation metric. The simple sentences produced by the mT5 model achieve P 0.72, R 0.68 and F-1 0.70 via BERTScore, while combining Arabic-BERT and fastText achieves P 0.97, R 0.97 and F-1 0.97. In addition, we report a manual error analysis for these experiments."
}