Improving Sequence-to-Sequence Pre-training via Sequence Span Rewriting

Wangchunshu Zhou, Tao Ge, Canwen Xu, Ke Xu, Furu Wei


Abstract
In this paper, we propose Sequence Span Rewriting (SSR), a self-supervised task for sequence-to-sequence (Seq2Seq) pre-training. SSR trains the model to rewrite machine-generated, imperfect text spans into the ground-truth text, providing more fine-grained and informative supervision on top of the original text-infilling objective. Compared to the prevalent text-infilling objectives for Seq2Seq pre-training, SSR is naturally more consistent with many downstream generation tasks that require sentence rewriting (e.g., text summarization, question generation, grammatical error correction, and paraphrase generation). We conduct extensive experiments in which SSR is used to improve the typical Seq2Seq pre-trained model T5 in a continual pre-training setting, and we observe substantial improvements over T5 on a variety of natural language generation tasks.
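To make the objective concrete, below is a minimal sketch of how a single SSR training pair might be constructed: a span of the input is masked, a pre-trained text-infilling model (here assumed to be T5 via Hugging Face Transformers) fills it in, and the sentence containing that imperfect fill is paired with the original sentence as the rewriting target. The span-selection heuristic, the decoding settings, and the helper name make_ssr_pair are illustrative assumptions, not the authors' implementation; see the linked code at the bottom of this page for the actual recipe.

# A minimal sketch of building one SSR training pair, assuming Hugging Face
# Transformers with T5 as the span-infilling model. Span selection, decoding
# settings, and the function name are illustrative, not the authors' recipe.
import random
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
infiller = T5ForConditionalGeneration.from_pretrained("t5-base")

def make_ssr_pair(sentence: str, span_ratio: float = 0.3):
    """Return (source, target): source contains a machine-generated imperfect
    span, target is the original (ground-truth) sentence."""
    words = sentence.split()
    span_len = max(1, int(len(words) * span_ratio))
    start = random.randrange(0, len(words) - span_len + 1)

    # 1. Mask a contiguous span with a T5 sentinel token (standard text infilling).
    masked = " ".join(words[:start] + ["<extra_id_0>"] + words[start + span_len:])

    # 2. Let the pre-trained infilling model generate a (possibly imperfect)
    #    fill for the masked span.
    input_ids = tokenizer(masked, return_tensors="pt").input_ids
    out = infiller.generate(input_ids, do_sample=True, top_p=0.9,
                            max_new_tokens=span_len + 5)
    imperfect = tokenizer.decode(out[0], skip_special_tokens=True).strip()

    # 3. SSR source: the sentence with the imperfect span in place of the
    #    original; SSR target: the original sentence. The Seq2Seq model is
    #    then trained to rewrite the source back into the target.
    source = " ".join(words[:start] + [imperfect] + words[start + span_len:])
    return source, sentence

src, tgt = make_ssr_pair("The quick brown fox jumps over the lazy dog .")
print(src)   # sentence containing a machine-generated span
print(tgt)   # original ground-truth sentence

In practice the pairs produced this way are used exactly like ordinary Seq2Seq training data: the corrupted sentence is the encoder input and the original sentence is the decoder target.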
Anthology ID:
2021.emnlp-main.45
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
571–582
URL:
https://aclanthology.org/2021.emnlp-main.45
DOI:
10.18653/v1/2021.emnlp-main.45
Cite (ACL):
Wangchunshu Zhou, Tao Ge, Canwen Xu, Ke Xu, and Furu Wei. 2021. Improving Sequence-to-Sequence Pre-training via Sequence Span Rewriting. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 571–582, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Improving Sequence-to-Sequence Pre-training via Sequence Span Rewriting (Zhou et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/update-css-js/2021.emnlp-main.45.pdf
Code:
 michaelzhouwang/sequence_span_rewriting