Joy Mahapatra


Exploring Structural Encoding for Data-to-Text Generation
Joy Mahapatra | Utpal Garain
Proceedings of the 14th International Conference on Natural Language Generation

Owing to efficient end-to-end training and the fluency of their generated texts, several encoder-decoder framework-based models have recently been proposed for data-to-text generation. Appropriate encoding of the input data is a crucial part of such encoder-decoder models; however, only a few research works have concentrated on proper encoding methods. This paper presents a novel encoder-decoder based data-to-text generation model in which the proposed encoder carefully encodes the input data according to its underlying structure. The effectiveness of the proposed encoder is evaluated both extrinsically and intrinsically by shuffling the input data without changing its meaning. To select appropriate content from the encoder's output, the proposed model incorporates attention gates in the decoder. Through extensive experiments on the WikiBio and E2E datasets, we show that our model outperforms state-of-the-art models and several standard baseline systems. Analysis of the model through component ablation tests and human evaluation endorses the proposed model as a well-grounded system.


Unsupervised Morpheme Segmentation Through Numerical Weighting and Thresholding
Joy Mahapatra | Sudip Kumar Naskar
Proceedings of the 14th International Conference on Natural Language Processing (ICON-2017)


Statistical Natural Language Generation from Tabular Non-textual Data
Joy Mahapatra | Sudip Kumar Naskar | Sivaji Bandyopadhyay
Proceedings of the 9th International Natural Language Generation conference