@inproceedings{ray-etal-2019-fast,
    title = "Fast Domain Adaptation of Semantic Parsers via Paraphrase Attention",
    author = "Ray, Avik  and
      Shen, Yilin  and
      Jin, Hongxia",
    editor = "Cherry, Colin  and
      Durrett, Greg  and
      Foster, George  and
      Haffari, Reza  and
      Khadivi, Shahram  and
      Peng, Nanyun  and
      Ren, Xiang  and
      Swayamdipta, Swabha",
    booktitle = "Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019)",
    month = nov,
    year = "2019",
    address = "Hong Kong, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/iwcs-25-ingestion/D19-6111/",
    doi = "10.18653/v1/D19-6111",
    pages = "94--103",
    abstract = "Semantic parsers are used to convert user{'}s natural language commands to executable logical form in intelligent personal agents. Labeled datasets required to train such parsers are expensive to collect, and are never comprehensive. As a result, for effective post-deployment domain adaptation and personalization, semantic parsers are continuously retrained to learn new user vocabulary and paraphrase variety. However, state-of-the art attention based neural parsers are slow to retrain which inhibits real time domain adaptation. Secondly, these parsers do not leverage numerous paraphrases already present in the training dataset. Designing parsers which can simultaneously maintain high accuracy and fast retraining time is challenging. In this paper, we present novel paraphrase attention based sequence-to-sequence/tree parsers which support fast near real time retraining. In addition, our parsers often boost accuracy by jointly modeling the semantic dependencies of paraphrases. We evaluate our model on benchmark datasets to demonstrate upto 9X speedup in retraining time compared to existing parsers, as well as achieving state-of-the-art accuracy."
}