Nima Pourdamghani


2020

Severing the Edge Between Before and After: Neural Architectures for Temporal Ordering of Events
Miguel Ballesteros | Rishita Anubhai | Shuai Wang | Nima Pourdamghani | Yogarshi Vyas | Jie Ma | Parminder Bhatia | Kathleen McKeown | Yaser Al-Onaizan
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

In this paper, we propose a neural architecture and a set of training methods for ordering events by predicting temporal relations. Our proposed models receive a pair of events within a span of text as input and identify the temporal relation (Before, After, Equal, Vague) between them. Given that a key challenge for this task is the scarcity of annotated data, our models rely on pretrained representations (i.e., RoBERTa, BERT, or ELMo), transfer and multi-task learning (leveraging complementary datasets), and self-training techniques. Experiments on the MATRES dataset of English documents establish a new state-of-the-art on this task.
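As a concrete illustration of the setup (a minimal sketch, not the paper's exact architecture), the snippet below classifies the temporal relation between two marked events with a pretrained RoBERTa encoder; the model name, hard-coded event token positions, and simple classification head are illustrative assumptions.

```python
# Sketch of a pairwise temporal relation classifier: encode the sentence
# with RoBERTa and classify the concatenated contextual vectors of the
# two event-trigger tokens. Not the paper's exact model.
import torch
import torch.nn as nn
from transformers import RobertaModel, RobertaTokenizerFast

LABELS = ["Before", "After", "Equal", "Vague"]

class PairwiseTemporalClassifier(nn.Module):
    def __init__(self, model_name="roberta-base", num_labels=len(LABELS)):
        super().__init__()
        self.encoder = RobertaModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Linear head over the concatenated event representations.
        self.head = nn.Linear(2 * hidden, num_labels)

    def forward(self, input_ids, attention_mask, e1_idx, e2_idx):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        h = out.last_hidden_state                        # (batch, seq, hidden)
        batch = torch.arange(h.size(0))
        e1 = h[batch, e1_idx]                            # event-1 token vectors
        e2 = h[batch, e2_idx]                            # event-2 token vectors
        return self.head(torch.cat([e1, e2], dim=-1))    # (batch, num_labels)

tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")
model = PairwiseTemporalClassifier()

text = "She finished the report before the meeting started."
enc = tokenizer(text, return_tensors="pt")
# Token positions of the triggers "finished" and "started", hard-coded here
# for illustration; in practice they come from gold or predicted event spans.
logits = model(enc["input_ids"], enc["attention_mask"],
               e1_idx=torch.tensor([2]), e2_idx=torch.tensor([8]))
print(LABELS[logits.argmax(-1).item()])   # untrained, so the label is random
```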

2019

Translating Translationese: A Two-Step Approach to Unsupervised Machine Translation
Nima Pourdamghani | Nada Aldarrab | Marjan Ghazvininejad | Kevin Knight | Jonathan May
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics

Given a rough, word-by-word gloss of a source language sentence, native speakers of the target language can uncover the latent, fully fluent rendering of the translation. In this work we explore this intuition by breaking translation into a two-step process: generating a rough gloss by means of a dictionary, and then ‘translating’ the resulting pseudo-translation, or ‘Translationese’, into a fully fluent translation. We build our Translationese decoder once, from a mixture of parallel data that shares the target language, and can then build dictionaries on demand using unsupervised techniques, yielding rapidly generated unsupervised neural MT systems for many source languages. We apply this process to 14 test languages, obtaining translation results on high-resource languages that are better than or comparable to those of previously published unsupervised MT studies, and good-quality results for low-resource languages that had never been used in an unsupervised MT scenario.
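The first, dictionary-gloss step can be pictured with a toy sketch (the dictionary and sentence below are invented; in the paper the dictionaries are induced without parallel data, and a separately trained decoder then rewrites the disfluent gloss):

```python
# Step one of the two-step process: a word-by-word dictionary gloss that
# produces disfluent 'Translationese'. Step two (not shown) rewrites it
# with a Translationese decoder trained once on target-side parallel data.
def gloss(sentence, dictionary):
    """Map each source token to a target token; keep unknown words as-is."""
    return " ".join(dictionary.get(tok, tok) for tok in sentence.split())

# Toy Spanish-English dictionary; in the paper such dictionaries are
# built on demand with unsupervised techniques.
toy_dict = {"la": "the", "casa": "house", "grande": "big",
            "es": "is", "roja": "red"}

print(gloss("la casa grande es roja", toy_dict))
# -> "the house big is red": rough word order that step two would repair
```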

2018

Using Word Vectors to Improve Word Alignments for Low Resource Machine Translation
Nima Pourdamghani | Marjan Ghazvininejad | Kevin Knight
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)

We present a method for improving word alignments using word similarities. The method encourages common alignment links between semantically similar words, using word vectors trained on monolingual data to estimate similarity. Our experiments on translating fifteen languages into English show consistent BLEU score improvements across the languages.
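One way to picture the intuition (a sketch under assumed details, not the paper's exact formulation) is to smooth a lexical translation table so that semantically similar source words share alignment links; the word vectors, probabilities, and interpolation weight below are toy values.

```python
# Smooth t(tgt|src) toward the translation probabilities of source words
# that are close to src in a monolingual embedding space, so that similar
# words come to share alignment links. Illustrative sketch only.
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def smoothed_t(src, tgt, t_table, vecs, lam=0.5):
    """Interpolate t(tgt|src) with t(tgt|src') for similar source words src'."""
    base = t_table[src].get(tgt, 0.0)
    # Similarity weights for the other source words; negatives clipped to 0.
    weights = {s: max(0.0, cosine(vecs[src], vecs[s]))
               for s in t_table if s != src and s in vecs}
    z = sum(weights.values())
    if z == 0.0:
        return base
    neighbor_p = sum(w * t_table[s].get(tgt, 0.0) for s, w in weights.items()) / z
    return (1 - lam) * base + lam * neighbor_p

# Toy example: 'cat' is rarely seen aligned to 'chat', but its neighbor
# 'kitten' often is, so the smoothed link score rises.
vecs = {"cat": np.array([1.0, 0.1]), "kitten": np.array([0.9, 0.2]),
        "tractor": np.array([-1.0, 0.8])}
t_table = {"cat": {"chat": 0.05}, "kitten": {"chat": 0.6}, "tractor": {}}
print(round(smoothed_t("cat", "chat", t_table, vecs), 3))   # ~0.325
```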

2017

Deciphering Related Languages
Nima Pourdamghani | Kevin Knight
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing

We present a method for translating texts between closely related language pairs. The method requires neither parallel data nor that the languages be written in the same script. We show results for six language pairs: Afrikaans/Dutch, Bosnian/Serbian, Danish/Swedish, Macedonian/Bulgarian, Malaysian/Indonesian, and Polish/Belarusian. We report BLEU scores showing that our method outperforms others that do not use parallel data.

2016

Generating English from Abstract Meaning Representations
Nima Pourdamghani | Kevin Knight | Ulf Hermjakob
Proceedings of the 9th International Natural Language Generation Conference

A Multi-media Approach to Cross-lingual Entity Knowledge Transfer
Di Lu | Xiaoman Pan | Nima Pourdamghani | Shih-Fu Chang | Heng Ji | Kevin Knight
Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

2014

Aligning English Strings with Abstract Meaning Representation Graphs
Nima Pourdamghani | Yang Gao | Ulf Hermjakob | Kevin Knight
Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)