Abstract
E-commerce stores collect customer feedback so that sellers can learn about customer concerns and improve the customer order experience. Because customer feedback often contains redundant information, a concise summary can be generated to help sellers better understand the issues causing customer dissatisfaction. Previous state-of-the-art abstractive text summarization models make two major types of factual errors when producing summaries from customer feedback: wrong entity detection (WED) and incorrect product-defect description (IPD). In this work, we introduce a set of methods to enhance the factual consistency of abstractive summarization on customer feedback. We augment the training data with artificially corrupted summaries and use them as counterparts of the target summaries. We add a contrastive loss term to the training objective so that the model learns to avoid certain factual errors. Evaluation results show that a large portion of WED and IPD errors is alleviated for BART and T5. Furthermore, our approaches do not depend on the structure of the summarization model and are thus generalizable to any abstractive summarization system.
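As a concrete illustration of the training objective sketched in the abstract, below is a minimal PyTorch sketch of a contrastive term that contrasts a reference summary with an artificially corrupted one. The margin-based form of the loss, the weighting factor `alpha`, and the assumption that corrupted summaries are precomputed in the data pipeline are illustrative choices, not the paper's exact formulation.

```python
# Hypothetical sketch of a contrastive objective for abstractive summarization,
# in the spirit described by the abstract. The hinge form, `alpha`, `margin`,
# and the batch layout are assumptions for illustration, not the paper's method.
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")


def sequence_log_likelihood(model, input_ids, attention_mask, labels):
    """Mean token log-likelihood of `labels` given the source text."""
    out = model(input_ids=input_ids, attention_mask=attention_mask, labels=labels)
    # `out.loss` is the mean cross-entropy over label tokens,
    # so its negation is the mean token log-likelihood.
    return -out.loss


def contrastive_step(model, batch, margin=1.0, alpha=0.5):
    # Log-likelihood of the reference (target) summary ...
    ll_pos = sequence_log_likelihood(
        model, batch["input_ids"], batch["attention_mask"], batch["labels"]
    )
    # ... and of its corrupted counterpart (e.g., with a swapped entity or a
    # wrong defect description), assumed to be prepared during augmentation.
    ll_neg = sequence_log_likelihood(
        model, batch["input_ids"], batch["attention_mask"], batch["corrupted_labels"]
    )
    # Standard MLE loss plus a hinge that pushes the reference summary at
    # least `margin` above the corrupted one in mean log-likelihood.
    mle_loss = -ll_pos
    contrastive = torch.clamp(margin - (ll_pos - ll_neg), min=0.0)
    return mle_loss + alpha * contrastive
```

Because the contrastive term only touches the loss, not the network, a sketch like this plugs into an ordinary training loop for any seq2seq model (BART, T5, etc.), which matches the abstract's claim that the approach is model-agnostic.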
- Anthology ID: 2021.ecnlp-1.19
- Volume: Proceedings of the 4th Workshop on e-Commerce and NLP
- Month: August
- Year: 2021
- Address: Online
- Editors: Shervin Malmasi, Surya Kallumadi, Nicola Ueffing, Oleg Rokhlenko, Eugene Agichtein, Ido Guy
- Venue: ECNLP
- Publisher: Association for Computational Linguistics
- Pages: 158–163
- URL: https://aclanthology.org/2021.ecnlp-1.19
- DOI: 10.18653/v1/2021.ecnlp-1.19
- Cite (ACL): Yang Liu, Yifei Sun, and Vincent Gao. 2021. Improving Factual Consistency of Abstractive Summarization on Customer Feedback. In Proceedings of the 4th Workshop on e-Commerce and NLP, pages 158–163, Online. Association for Computational Linguistics.
- Cite (Informal): Improving Factual Consistency of Abstractive Summarization on Customer Feedback (Liu et al., ECNLP 2021)
- PDF: https://preview.aclanthology.org/emnlp22-frontmatter/2021.ecnlp-1.19.pdf