@inproceedings{saunack-etal-2021-low,
    title = "How low is too low? A monolingual take on lemmatisation in {I}ndian languages",
    author = "Saunack, Kumar  and
      Saurav, Kumar  and
      Bhattacharyya, Pushpak",
    editor = "Toutanova, Kristina  and
      Rumshisky, Anna  and
      Zettlemoyer, Luke  and
      Hakkani-Tur, Dilek  and
      Beltagy, Iz  and
      Bethard, Steven  and
      Cotterell, Ryan  and
      Chakraborty, Tanmoy  and
      Zhou, Yichao",
    booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
    month = jun,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2021.naacl-main.322/",
    doi = "10.18653/v1/2021.naacl-main.322",
    pages = "4088--4094",
    abstract = "Lemmatization aims to reduce the sparse data problem by relating the inflected forms of a word to its dictionary form. Most prior work on ML based lemmatization has focused on high resource languages, where data sets (word forms) are readily available. For languages which have no linguistic work available, especially on morphology, or in languages where the computational realization of linguistic rules is complex and cumbersome, machine learning based lemmatizers are the way to go. In this paper, we devote our attention to lemmatisation for low resource, morphologically rich scheduled Indian languages using neural methods. Here, low resource means only a small number of word forms are available. We perform tests to analyse the variance in monolingual models' performance on varying the corpus size and contextual morphological tag data for training. We show that monolingual approaches with data augmentation can give competitive accuracy even in the low resource setting, which augurs well for NLP in low resource settings."
}