A Syntactic Neural Model for General-Purpose Code Generation

Pengcheng Yin, Graham Neubig


Abstract
We consider the problem of parsing natural language descriptions into source code written in a general-purpose programming language like Python. Existing data-driven methods treat this problem as a language generation task without considering the underlying syntax of the target programming language. Informed by previous work in semantic parsing, in this paper we propose a novel neural architecture powered by a grammar model to explicitly capture the target syntax as prior knowledge. Experiments find this an effective way to scale up to generation of complex programs from natural language descriptions, achieving state-of-the-art results that substantially outperform previous code generation and semantic parsing approaches.
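The key idea in the abstract is that the decoder predicts a sequence of grammar actions that build an abstract syntax tree, rather than emitting surface tokens directly. As a rough illustration only (not the paper's implementation; the example snippet and the `productions` helper are assumptions for exposition), Python's standard `ast` module can show the tree, and the production rules, that such an action sequence would construct:

```python
import ast

# Illustration: a syntax-driven decoder emits AST construction actions.
# Parsing a finished program recovers the tree those actions would build.
code = "sorted(my_list, key=len, reverse=True)"
tree = ast.parse(code, mode="eval")

def productions(node):
    """Linearize an AST into (parent_type, child_types) grammar productions,
    approximating the action sequence a grammar-constrained decoder predicts."""
    children = list(ast.iter_child_nodes(node))
    rules = [(type(node).__name__, [type(c).__name__ for c in children])]
    for child in children:
        rules.extend(productions(child))
    return rules

for head, body in productions(tree.body):
    print(f"{head} -> {' '.join(body) or '<terminal>'}")
```

Because every prediction is a production of the Python grammar, the model can only ever output well-formed trees, which is the prior knowledge the abstract refers to.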
Anthology ID:
P17-1041
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
440–450
URL:
https://aclanthology.org/P17-1041
DOI:
10.18653/v1/P17-1041
Cite (ACL):
Pengcheng Yin and Graham Neubig. 2017. A Syntactic Neural Model for General-Purpose Code Generation. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 440–450, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
A Syntactic Neural Model for General-Purpose Code Generation (Yin & Neubig, ACL 2017)
PDF:
https://preview.aclanthology.org/nschneid-patch-1/P17-1041.pdf
Note:
 P17-1041.Notes.pdf
Video:
 https://preview.aclanthology.org/nschneid-patch-1/P17-1041.mp4
Code:
 additional community code