Stylistic MR-to-Text Generation Using Pre-trained Language Models

Kunal Pagarey, Kanika Kalra, Abhay Garg, Saumajit Saha, Mayur Patidar, Shirish Karande


Abstract
We explore the ability of the pre-trained language models BART (an encoder-decoder model) and GPT-2 and GPT-Neo (both decoder-only models) to generate sentences from structured MR tags as input. We observe the best results on several metrics for the YelpNLG and E2E datasets. Style-based implicit tags such as emotion, sentiment, and length allow for controlled generation but are typically not present in the MR. We present an analysis on YelpNLG showing that BART can express the content with stylistic variations in sentence structure. Motivated by these results, we define a new task of emotional situation generation, using various POS tags and emotion label values as the MR, on the EmpatheticDialogues dataset, and report a baseline. Encoder-decoder attention analysis shows that BART learns different aspects of the MR at various layers and heads.
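As a rough illustration of the MR-to-text setup described in the abstract, the sketch below feeds a linearized MR string to a BART checkpoint via Hugging Face Transformers. The checkpoint name, the attribute[value] linearization, and the example attribute and style values are illustrative assumptions, not the authors' exact configuration; in the paper the models would first be fine-tuned on YelpNLG or E2E.

```python
# Minimal sketch of MR-to-text generation with BART (Hugging Face Transformers).
# The checkpoint, MR linearization, and attribute values are illustrative
# assumptions; the paper's models are fine-tuned on YelpNLG / E2E before use.
from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# A flat MR: attribute[value] pairs, optionally extended with implicit
# style tags (e.g. sentiment, length) to steer the generated sentence.
mr = "name[The Mill] food[Italian] priceRange[cheap] sentiment[positive]"

inputs = tokenizer(mr, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The same model's forward pass can also return cross-attention weights (by passing output_attentions=True), which is the kind of signal the encoder-decoder attention analysis in the abstract inspects across layers and heads.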
Anthology ID:
2021.icon-main.13
Volume:
Proceedings of the 18th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2021
Address:
National Institute of Technology Silchar, Silchar, India
Editors:
Sivaji Bandyopadhyay, Sobha Lalitha Devi, Pushpak Bhattacharyya
Venue:
ICON
Publisher:
NLP Association of India (NLPAI)
Pages:
93–99
URL:
https://aclanthology.org/2021.icon-main.13
Cite (ACL):
Kunal Pagarey, Kanika Kalra, Abhay Garg, Saumajit Saha, Mayur Patidar, and Shirish Karande. 2021. Stylistic MR-to-Text Generation Using Pre-trained Language Models. In Proceedings of the 18th International Conference on Natural Language Processing (ICON), pages 93–99, National Institute of Technology Silchar, Silchar, India. NLP Association of India (NLPAI).
Cite (Informal):
Stylistic MR-to-Text Generation Using Pre-trained Language Models (Pagarey et al., ICON 2021)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/2021.icon-main.13.pdf
Optional supplementary material:
 2021.icon-main.13.OptionalSupplementaryMaterial.zip