Joseph Turian

Also published as: Joseph P. Turian


2020

Experience Grounds Language
Yonatan Bisk | Ari Holtzman | Jesse Thomason | Jacob Andreas | Yoshua Bengio | Joyce Chai | Mirella Lapata | Angeliki Lazaridou | Jonathan May | Aleksandr Nisnevich | Nicolas Pinto | Joseph Turian
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

Language understanding research is held back by a failure to relate language to the physical world it describes and to the social interactions it facilitates. Despite the incredible effectiveness of language processing models at tackling tasks after being trained on text alone, successful linguistic communication relies on a shared experience of the world. It is this shared experience that makes utterances meaningful. Natural language processing is a diverse field, and progress throughout its development has come from new representational theories, modeling techniques, data collection paradigms, and tasks. We posit that the present success of representation learning approaches trained on large, text-only corpora requires the parallel tradition of research on the broader physical and social context of language to address the deeper questions of communication.

2010

Word Representations: A Simple and General Method for Semi-Supervised Learning
Joseph Turian | Lev-Arie Ratinov | Yoshua Bengio
Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics

2009

Quadratic Features and Deep Architectures for Chunking
Joseph Turian | James Bergstra | Yoshua Bengio
Proceedings of Human Language Technologies: The 2009 Annual Conference of the North American Chapter of the Association for Computational Linguistics, Companion Volume: Short Papers

2006

Advances in Discriminative Parsing
Joseph Turian | I. Dan Melamed
Proceedings of the 21st International Conference on Computational Linguistics and 44th Annual Meeting of the Association for Computational Linguistics

Scalable Purely-Discriminative Training for Word and Tree Transducers
Benjamin Wellington | Joseph Turian | Chris Pike | Dan Melamed
Proceedings of the 7th Conference of the Association for Machine Translation in the Americas: Technical Papers

Discriminative training methods have recently led to significant advances in the state of the art of machine translation (MT). Another promising trend is the incorporation of syntactic information into MT systems. Combining these trends is difficult for reasons of both system and computational complexity. The present study makes progress towards a syntax-aware MT system whose every component is trained discriminatively. Our main innovation is an approach to discriminative learning that is computationally efficient enough for large statistical MT systems, yet whose accuracy on translation sub-tasks is near the state of the art. Our source code is downloadable from http://nlp.cs.nyu.edu/GenPar/.
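
The abstract does not spell out the learning algorithm here, so as a point of reference only, below is a minimal perceptron-style ranker over sparse features: a generic instance of discriminative training for a translation sub-task, not the paper's method. The feature names and the candidate-ranking setup are hypothetical.

```python
from collections import defaultdict

def train_discriminative_ranker(examples, epochs=5):
    """Perceptron-style discriminative training for candidate ranking.

    A generic sketch of discriminative learning over sparse features,
    NOT the algorithm from the paper. Each example is
    (candidates, gold_index), where each candidate is a dict mapping
    feature name -> value.
    """
    weights = defaultdict(float)

    def score(feats):
        # Linear model: dot product of weights and sparse features.
        return sum(weights[f] * v for f, v in feats.items())

    for _ in range(epochs):
        for candidates, gold in examples:
            # Predict the highest-scoring candidate under current weights.
            pred = max(range(len(candidates)), key=lambda i: score(candidates[i]))
            if pred != gold:
                # Standard perceptron update: promote the gold candidate's
                # features, demote the mistaken prediction's features.
                for f, v in candidates[gold].items():
                    weights[f] += v
                for f, v in candidates[pred].items():
                    weights[f] -= v
    return weights

# Toy usage with made-up features ("lm", "len" are hypothetical).
data = [
    ([{"lm": 1.0, "len": 0.2}, {"lm": 0.3, "len": 0.9}], 0),
    ([{"lm": 0.1, "len": 0.8}, {"lm": 0.9, "len": 0.1}], 1),
]
w = train_discriminative_ranker(data)
print(dict(w))
```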

Computational Challenges in Parsing by Classification
Joseph Turian | I. Dan Melamed
Proceedings of the Workshop on Computationally Hard Problems and Joint Inference in Speech and Language Processing

2005

Constituent Parsing by Classification
Joseph Turian | I. Dan Melamed
Proceedings of the Ninth International Workshop on Parsing Technology

2003

Precision and Recall of Machine Translation
I. Dan Melamed | Ryan Green | Joseph P. Turian
Companion Volume of the Proceedings of HLT-NAACL 2003 - Short Papers

Evaluation of machine translation and its evaluation
Joseph P. Turian | Luke Shen | I. Dan Melamed
Proceedings of Machine Translation Summit IX: Papers

Evaluation of MT evaluation measures is limited by inconsistent human judgment data. Nonetheless, machine translation can be evaluated using the well-known measures precision, recall, and their harmonic mean, the F-measure. The unigram-based F-measure has significantly higher correlation with human judgments than recently proposed alternatives. More importantly, this standard measure has an intuitive graphical interpretation, which can facilitate insight into how MT systems might be improved. The relevant software is publicly available from http://nlp.cs.nyu.edu/GTM/.
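
As a concrete illustration of the measures the abstract names, here is a minimal sketch of unigram precision, recall, and F-measure under bag-of-words matching. It is not the GTM implementation itself: the released tool also rewards longer contiguous matching runs, which this sketch omits.

```python
from collections import Counter

def unigram_prf(candidate, reference):
    """Unigram precision, recall, and F-measure between a candidate
    translation and a single reference, using bag-of-words overlap."""
    cand = Counter(candidate.split())
    ref = Counter(reference.split())
    # Clipped overlap: each candidate token matches at most as many
    # times as it appears in the reference.
    overlap = sum((cand & ref).values())
    precision = overlap / max(sum(cand.values()), 1)
    recall = overlap / max(sum(ref.values()), 1)
    # F-measure is the harmonic mean of precision and recall.
    f = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f

# Example: a candidate that drops one word of the reference.
p, r, f = unigram_prf("the cat sat on mat", "the cat sat on the mat")
print(f"P={p:.3f} R={r:.3f} F={f:.3f}")  # P=1.000 R=0.833 F=0.909
```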