@inproceedings{yutao-etal-2022-interactive,
    title = "Interactive {M}ongolian Question Answer Matching Model Based on Attention Mechanism in the Law Domain",
    author = "Yutao, Peng  and
      Weihua, Wang  and
      Feilong, Bao",
    editor = "Sun, Maosong  and
      Liu, Yang  and
      Che, Wanxiang  and
      Feng, Yang  and
      Qiu, Xipeng  and
      Rao, Gaoqi  and
      Chen, Yubo",
    booktitle = "Proceedings of the 21st Chinese National Conference on Computational Linguistics",
    month = oct,
    year = "2022",
    address = "Nanchang, China",
    publisher = "Chinese Information Processing Society of China",
    url = "https://preview.aclanthology.org/ingest-emnlp/2022.ccl-1.79/",
    pages = "896--907",
    language = "eng",
    abstract = "``Mongolian question answer matching task is challenging, since Mongolian is a kind of lowresource language and its complex morphological structures lead to data sparsity. In this work, we propose an Interactive Mongolian Question Answer Matching Model (IMQAMM) based on attention mechanism for Mongolian question answering system. The key parts of the model are interactive information enhancement and max-mean pooling matching. Interactive information enhancement contains sequence enhancement and multi-cast attention. Sequence enhancement aims to provide a subsequent encoder with an enhanced sequence representation, and multi-cast attention is designed to generate scalar features through multiple attention mechanisms. MaxMean pooling matching is to obtain the matching vectors for aggregation. Moreover, we introduce Mongolian morpheme representation to better learn the semantic feature. The model experimented on the Mongolian corpus, which contains question-answer pairs of various categories in the law domain. Experimental results demonstrate that our proposed Mongolian question answer matching model significantly outperforms baseline models.''"
}
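The max-mean pooling matching step described in the abstract can be illustrated with a minimal sketch. The PyTorch snippet below shows the general idea of aggregating per-token matching vectors by concatenating max- and mean-pooled views; it is not the paper's implementation, and the function name, tensor shapes, and masking convention are assumptions.

```python
import torch

def max_mean_pooling_matching(match_vectors: torch.Tensor,
                              mask: torch.Tensor) -> torch.Tensor:
    """Aggregate per-token matching vectors of shape (batch, seq_len, dim)
    into a fixed-size vector by concatenating max and mean pooling over
    the sequence axis. `mask` is (batch, seq_len): 1 for real tokens,
    0 for padding. Shapes and masking are illustrative assumptions."""
    mask = mask.unsqueeze(-1).to(match_vectors.dtype)   # (batch, seq_len, 1)
    neg_inf = torch.finfo(match_vectors.dtype).min
    # Max pooling: fill padded positions so they can never win the max.
    max_pooled = match_vectors.masked_fill(mask == 0, neg_inf).max(dim=1).values
    # Mean pooling: average only over real (unmasked) tokens.
    mean_pooled = (match_vectors * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1.0)
    return torch.cat([max_pooled, mean_pooled], dim=-1)  # (batch, 2 * dim)
```

Concatenating both pooled views is a common aggregation choice: max pooling keeps the strongest matching signal per dimension, while mean pooling preserves how consistently the sequences match overall.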