Proceedings of the NAACL-HLT 2012 Workshop: Will We Ever Really Replace the N-gram Model? On the Future of Language Modeling for HLT
Bhuvana Ramabhadran, Sanjeev Khudanpur, Ebru Arisoy (Editors)
- Anthology ID: W12-27
- Month: June
- Year: 2012
- Address: Montréal, Canada
- Venue: NAACL
- SIG:
- Publisher: Association for Computational Linguistics
- URL: https://aclanthology.org/W12-27
- DOI:
- PDF: https://preview.aclanthology.org/ml4al-ingestion/W12-27.pdf
Proceedings of the NAACL-HLT 2012 Workshop: Will We Ever Really Replace the N-gram Model? On the Future of Language Modeling for HLT
Bhuvana Ramabhadran | Sanjeev Khudanpur | Ebru Arisoy
Measuring the Influence of Long Range Dependencies with Neural Network Language Models
Hai Son Le | Alexandre Allauzen | François Yvon
Large, Pruned or Continuous Space Language Models on a GPU for Statistical Machine Translation
Holger Schwenk | Anthony Rousseau | Mohammed Attik
Deep Neural Network Language Models
Ebru Arisoy | Tara N. Sainath | Brian Kingsbury | Bhuvana Ramabhadran
A Challenge Set for Advancing Language Modeling
Geoffrey Zweig | Chris J.C. Burges
Unsupervised Vocabulary Adaptation for Morph-based Language Models
André Mansikkaniemi | Mikko Kurimo
Large-scale discriminative language model reranking for voice-search
Preethi Jyothi | Leif Johnson | Ciprian Chelba | Brian Strope
Revisiting the Case for Explicit Syntactic Information in Language Models
Ariya Rastrow | Sanjeev Khudanpur | Mark Dredze