Non-autoregressive Text Editing with Copy-aware Latent Alignments

Yu Zhang, Yue Zhang, Leyang Cui, Guohong Fu


Abstract
Recent work has witnessed a paradigm shift from Seq2Seq to Seq2Edit in the field of text editing, with the aim of addressing the slow autoregressive inference problem posed by the former. Despite promising results, Seq2Edit approaches still face several challenges such as inflexibility in generation and difficulty in generalizing to other languages. In this work, we propose a novel non-autoregressive text editing method to circumvent the above issues, by modeling the edit process with latent CTC alignments. We make a crucial extension to CTC by introducing the copy operation into the edit space, thus enabling more efficient management of textual overlap in editing. We conduct extensive experiments on grammatical error correction (GEC) and sentence fusion tasks, showing that our proposed method significantly outperforms existing Seq2Edit models and achieves similar or even better results than Seq2Seq with over 4× speedup. Moreover, it demonstrates good generalizability on German and Russian. In-depth analyses reveal the strengths of our method in terms of its robustness under various scenarios and its ability to generate fluent and flexible outputs.
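The copy-aware CTC idea described above can be illustrated with a short sketch. Below is a minimal, log-space CTC forward pass (not the authors' released code) in which the label space is augmented with a position-dependent COPY label: predicting COPY at a decoder slot emits the source token aligned to that slot, so the probability of producing a target token is the sum of predicting it directly and copying it. All names here (`copy_aware_ctc_loss`, `src_aligned`, etc.) are hypothetical, chosen for illustration only.

```python
import numpy as np

NEG_INF = -1e30


def logaddexp(*xs):
    # Numerically stable log(sum(exp(x) for x in xs)).
    m = max(xs)
    if m <= NEG_INF:
        return NEG_INF
    return m + np.log(sum(np.exp(x - m) for x in xs))


def copy_aware_ctc_loss(log_probs, src_aligned, target, blank=0, copy=1):
    """Negative log-likelihood of `target` under a CTC model whose label
    space is {BLANK, COPY} + vocabulary. A COPY emitted at slot t stands
    for the source token aligned to that slot (src_aligned[t]).

    log_probs   : (T, V) array of log-probabilities per decoder slot.
    src_aligned : length-T sequence of source token ids, one per slot.
    target      : non-empty list of target token ids (no blank/copy).
    """
    T = log_probs.shape[0]
    assert len(target) > 0, "sketch assumes a non-empty target"

    def emit(t, v):
        # Token v can be produced directly, or via COPY when the slot's
        # aligned source token happens to be v.
        p = log_probs[t, v]
        if src_aligned[t] == v:
            p = logaddexp(p, log_probs[t, copy])
        return p

    # Blank-interleaved target, as in standard CTC.
    z = [blank]
    for y in target:
        z += [y, blank]
    S = len(z)

    alpha = np.full((T, S), NEG_INF)
    alpha[0, 0] = log_probs[0, blank]
    alpha[0, 1] = emit(0, z[1])
    for t in range(1, T):
        for s in range(S):
            cands = [alpha[t - 1, s]]
            if s > 0:
                cands.append(alpha[t - 1, s - 1])
            # A blank may be skipped only between distinct labels.
            if s > 1 and z[s] != blank and z[s] != z[s - 2]:
                cands.append(alpha[t - 1, s - 2])
            e = log_probs[t, blank] if z[s] == blank else emit(t, z[s])
            alpha[t, s] = logaddexp(*cands) + e

    # Valid paths end on the last label or the trailing blank.
    return -logaddexp(alpha[T - 1, S - 1], alpha[T - 1, S - 2])
```

The design intuition, per the abstract, is that a single COPY label lets the model keep unchanged source tokens without spreading probability mass over the whole vocabulary, which is what makes handling the heavy textual overlap in editing tasks more efficient.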
Anthology ID:
2023.emnlp-main.437
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7075–7085
URL:
https://aclanthology.org/2023.emnlp-main.437
DOI:
10.18653/v1/2023.emnlp-main.437
Cite (ACL):
Yu Zhang, Yue Zhang, Leyang Cui, and Guohong Fu. 2023. Non-autoregressive Text Editing with Copy-aware Latent Alignments. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 7075–7085, Singapore. Association for Computational Linguistics.
Cite (Informal):
Non-autoregressive Text Editing with Copy-aware Latent Alignments (Zhang et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-3/2023.emnlp-main.437.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-3/2023.emnlp-main.437.mp4