Denny Britz


2018

JESC: Japanese-English Subtitle Corpus
Reid Pryzant | Youngjoo Chung | Dan Jurafsky | Denny Britz
Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018)

2017

Efficient Attention using a Fixed-Size Memory Representation
Denny Britz | Melody Guan | Minh-Thang Luong
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing

The standard content-based attention mechanism typically used in sequence-to-sequence models is computationally expensive as it requires the comparison of large encoder and decoder states at each time step. In this work, we propose an alternative attention mechanism based on a fixed-size memory representation that is more efficient. Our technique predicts a compact set of K attention contexts during encoding and lets the decoder compute an efficient lookup that does not need to consult the memory. We show that our approach performs on par with the standard attention mechanism while yielding inference speedups of 20% for real-world translation tasks and more for tasks with longer sequences. By visualizing attention scores we demonstrate that our models learn distinct, meaningful alignments.
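For intuition, here is a minimal PyTorch sketch of the mechanism as described above: K context vectors are predicted once during encoding, and each decoder step attends over only those K slots instead of the full sequence of encoder states. The module name, the slot and query projections, and the shapes are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryAttention(nn.Module):
    """Hypothetical sketch: attention over K predicted contexts instead of T encoder states."""

    def __init__(self, d_model: int, num_contexts: int):
        super().__init__()
        # Distributes each encoder state over the K memory slots (used at encoding time).
        self.slot_scorer = nn.Linear(d_model, num_contexts)
        # Maps the decoder state to attention logits over the K slots (used at decoding time).
        self.query_proj = nn.Linear(d_model, num_contexts)

    def encode(self, encoder_states: torch.Tensor) -> torch.Tensor:
        # encoder_states: (T, d). The only pass over the full source sequence:
        # predict K compact contexts as weighted sums of encoder states.
        weights = F.softmax(self.slot_scorer(encoder_states), dim=0)  # (T, K)
        return weights.t() @ encoder_states                           # (K, d)

    def decode_step(self, contexts: torch.Tensor, dec_state: torch.Tensor) -> torch.Tensor:
        # dec_state: (d,). Per-step attention is a lookup over only K contexts,
        # so its cost is independent of the source length T.
        scores = F.softmax(self.query_proj(dec_state), dim=-1)        # (K,)
        return scores @ contexts                                      # (d,)
```

The point of the sketch is that decode_step never touches the encoder states again, which is where the reported inference speedup comes from.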

Massive Exploration of Neural Machine Translation Architectures
Denny Britz | Anna Goldie | Minh-Thang Luong | Quoc Le
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing

Neural Machine Translation (NMT) has shown remarkable progress over the past few years, with production systems now being deployed to end-users. As the field is moving rapidly, it has become unclear which elements of NMT architectures have a significant impact on translation quality. In this work, we present a large-scale analysis of the sensitivity of NMT architectures to common hyperparameters. We report empirical results and variance numbers for several hundred experimental runs, corresponding to over 250,000 GPU hours on a WMT English to German translation task. Our experiments provide practical insights into the relative importance of factors such as embedding size, network depth, RNN cell type, residual connections, attention mechanism, and decoding heuristics. As part of this contribution, we also release an open-source NMT framework in TensorFlow to make it easy for others to reproduce our results and perform their own experiments.
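As a rough illustration of what such a sensitivity analysis sweeps over, the sketch below enumerates an experiment grid across the factors named in the abstract; the axis values are invented examples, not the paper's actual grid.

```python
# Illustrative experiment grid; values are example assumptions, not the paper's settings.
from itertools import product

grid = {
    "embedding_size": [128, 512, 2048],
    "encoder_depth": [1, 2, 4],
    "cell_type": ["lstm", "gru", "vanilla"],
    "residual_connections": [False, True],
    "attention": ["none", "additive", "multiplicative"],
    "beam_width": [1, 5, 10],
}

configs = [dict(zip(grid, values)) for values in product(*grid.values())]
print(f"{len(configs)} configurations to train and evaluate")
```

Even this small grid yields hundreds of configurations, which is why the study reports variance across several hundred runs rather than single numbers.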

Generating High-Quality and Informative Conversation Responses with Sequence-to-Sequence Models
Yuanlong Shao | Stephan Gouws | Denny Britz | Anna Goldie | Brian Strope | Ray Kurzweil
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing

Sequence-to-sequence models have been applied to the conversation response generation problem where the source sequence is the conversation history and the target sequence is the response. Unlike translation, conversation responding is inherently creative. The generation of long, informative, coherent, and diverse responses remains a hard task. In this work, we focus on the single-turn setting. We add self-attention to the decoder to maintain coherence in longer responses, and we propose a practical approach, called the glimpse-model, for scaling to large datasets. We introduce a stochastic beam-search algorithm with segment-by-segment reranking which lets us inject diversity earlier in the generation process. We trained on a combined dataset of over 2.3B conversation messages mined from the web. In human evaluation studies, our method produces longer responses overall, with a higher proportion rated as acceptable and excellent as length increases, compared to baseline sequence-to-sequence models with explicit length promotion. A back-off strategy produces better responses overall, across the full spectrum of lengths.
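As a toy illustration of the decoding idea, the sketch below samples beam successors instead of expanding only the highest-probability tokens and reranks candidates at fixed segment boundaries. next_probs, rerank, and all hyperparameters are hypothetical placeholders, not the paper's implementation.

```python
import math
import random

def stochastic_segment_beam_search(next_probs, rerank, beam_size=4,
                                   segment_len=5, max_len=30, eos=0):
    # next_probs(prefix) -> {token: probability}; rerank(prefix) -> score.
    beams = [([], 0.0)]  # (token prefix, cumulative log-probability)
    for step in range(1, max_len + 1):
        candidates = []
        for prefix, logp in beams:
            if prefix and prefix[-1] == eos:
                candidates.append((prefix, logp))
                continue
            probs = next_probs(prefix)
            tokens = list(probs)
            weights = [probs[t] for t in tokens]
            # Sample successors rather than taking the arg-max, injecting
            # diversity early in generation.
            for tok in random.choices(tokens, weights=weights, k=beam_size):
                candidates.append((prefix + [tok], logp + math.log(probs[tok])))
        if step % segment_len == 0:
            # At segment boundaries, prune with the external reranker instead
            # of model log-probability alone.
            candidates.sort(key=lambda c: rerank(c[0]), reverse=True)
        else:
            candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_size]
    return max(beams, key=lambda c: rerank(c[0]))[0]
```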

Effective Domain Mixing for Neural Machine Translation
Denny Britz | Quoc Le | Reid Pryzant
Proceedings of the Second Conference on Machine Translation