IIT (BHU) Varanasi at MSR-SRST 2018: A Language Model Based Approach for Natural Language Generation

Shreyansh Singh, Ayush Sharma, Avi Chawla, A.K. Singh


Abstract
This paper describes our submission system for the Shallow Track of the Surface Realization Shared Task 2018 (SRST’18). The task was to convert genuine UD structures, from which word order information had been removed and the tokens had been lemmatized, into their correct sentential form. We divide the problem into two sub-problems: word reinflection and word order prediction. For the first sub-problem, we use a Long Short-Term Memory (LSTM) based encoder-decoder approach. For the second sub-problem, we present a Language Model (LM) based approach. Within the LM-based approach we apply two different sub-approaches, and their combined result is taken as the final output of the system.
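The word-order step summarized above rests on scoring candidate token sequences with a language model. The sketch below is a minimal, self-contained illustration of that general idea, not the authors' implementation: it trains a toy bigram LM with add-one smoothing and uses beam search to order a bag of tokens. The corpus, the smoothing scheme, the beam width, and the function names are all illustrative assumptions.

```python
# Minimal sketch (not the paper's exact system): choose a word order for a bag
# of tokens by scoring partial sequences with a toy bigram language model and
# expanding the best partial hypotheses with beam search.
from collections import defaultdict
import math

def train_bigram_lm(sentences):
    """Count bigrams over tokenized sentences (with a <s> boundary marker)."""
    unigrams = defaultdict(int)
    bigrams = defaultdict(int)
    for sent in sentences:
        tokens = ["<s>"] + sent
        for prev, curr in zip(tokens, tokens[1:]):
            unigrams[prev] += 1
            bigrams[(prev, curr)] += 1
    vocab_size = len(unigrams) + 1

    def log_prob(prev, curr):
        # Add-one smoothing so unseen bigrams get a small, non-zero probability.
        return math.log((bigrams[(prev, curr)] + 1) / (unigrams[prev] + vocab_size))

    return log_prob

def order_tokens(tokens, log_prob, beam_width=5):
    """Beam search over orderings of `tokens`, scored by the bigram LM."""
    beams = [((), 0.0)]  # (sequence so far, cumulative log-probability)
    for _ in range(len(tokens)):
        candidates = []
        for seq, score in beams:
            remaining = list(tokens)
            for tok in seq:          # remove tokens already placed
                remaining.remove(tok)
            prev = seq[-1] if seq else "<s>"
            for tok in set(remaining):
                candidates.append((seq + (tok,), score + log_prob(prev, tok)))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return list(beams[0][0])

if __name__ == "__main__":
    corpus = [["the", "dog", "chased", "the", "cat"],
              ["the", "cat", "slept"]]
    lm = train_bigram_lm(corpus)
    print(order_tokens(["cat", "the", "slept"], lm))  # -> ['the', 'cat', 'slept']
```

In a realistic setting the bigram model would be replaced by a stronger LM (e.g. an LSTM LM trained on the shared-task data) and the search would also have to handle punctuation and duplicated lemmas, but the score-and-search structure stays the same.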
Anthology ID:
W18-3603
Volume:
Proceedings of the First Workshop on Multilingual Surface Realisation
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Simon Mille, Anja Belz, Bernd Bohnet, Emily Pitler, Leo Wanner
Venue:
ACL
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
29–34
URL:
https://aclanthology.org/W18-3603
DOI:
10.18653/v1/W18-3603
Cite (ACL):
Shreyansh Singh, Ayush Sharma, Avi Chawla, and A.K. Singh. 2018. IIT (BHU) Varanasi at MSR-SRST 2018: A Language Model Based Approach for Natural Language Generation. In Proceedings of the First Workshop on Multilingual Surface Realisation, pages 29–34, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
IIT (BHU) Varanasi at MSR-SRST 2018: A Language Model Based Approach for Natural Language Generation (Singh et al., ACL 2018)
PDF:
https://preview.aclanthology.org/naacl24-info/W18-3603.pdf
Code:
shreyansh26/SRST-18
Data:
Universal Dependencies