Joshua Goodman

Also published as: Joshua T. Goodman


2004

Exponential Priors for Maximum Entropy Models
Joshua Goodman
Proceedings of the Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics: HLT-NAACL 2004

2003

The State of the Art in Language Modeling
Joshua Goodman
Companion Volume of the Proceedings of HLT-NAACL 2003 - Tutorial Abstracts

2002

The State of the Art in Language Modeling
Joshua Goodman
Proceedings of the 5th Conference of the Association for Machine Translation in the Americas: Tutorial Descriptions

Sequential Conditional Generalized Iterative Scaling
Joshua Goodman
Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics

Exploring Asymmetric Clustering for Statistical Language Modeling
Jianfeng Gao | Joshua Goodman | Guihong Cao | Hang Li
Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics

An Incremental Decision List Learner
Joshua Goodman
Proceedings of the 2002 Conference on Empirical Methods in Natural Language Processing (EMNLP 2002)

2001

The Use of Clustering Techniques for Language Modeling - Application to Asian Language
Jianfeng Gao | Joshua T. Goodman | Jiangbo Miao
International Journal of Computational Linguistics & Chinese Language Processing, Volume 6, Number 1, February 2001: Special Issue on Natural Language Processing Researches in MSRA

1999

Semiring Parsing
Joshua Goodman
Computational Linguistics, Volume 25, Number 4, December 1999

1997

Global Thresholding and Multiple-Pass Parsing
Joshua Goodman
Second Conference on Empirical Methods in Natural Language Processing

Probabilistic Feature Grammars
Joshua Goodman
Proceedings of the Fifth International Workshop on Parsing Technologies

We present a new formalism, probabilistic feature grammar (PFG). PFGs combine most of the best properties of several other formalisms, including those of Collins, Magerman, and Charniak, and in experiments have comparable or better performance. PFGs generate features one at a time, probabilistically, conditioning the probabilities of each feature on other features in a local context. Because the conditioning is local, efficient polynomial time parsing algorithms exist for computing inside, outside, and Viterbi parses. PFGs can produce probabilities of strings, making them potentially useful for language modeling. Precision and recall results are comparable to the state of the art with words, and the best reported without words.
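
The abstract above describes trees whose node features are generated one at a time, each conditioned only on a local context, so that a tree's probability factors into a product of per-feature probabilities. Below is a minimal Python sketch of that factorization alone; the feature names, contexts, and probability values are invented for illustration and are not taken from the paper.

    # Toy sketch of PFG-style local factorization (illustrative only).
    # A node's features are generated one at a time; each feature is
    # conditioned on a small local context, here simply the features
    # already generated for the same node. All values are hypothetical.

    # P(feature_value | feature_name, local_context)
    FEATURE_MODEL = {
        ("label", ()): {"NP": 0.6, "VP": 0.4},
        ("head", ("NP",)): {"dog": 0.7, "cat": 0.3},
        ("head", ("VP",)): {"runs": 0.8, "sees": 0.2},
    }

    def node_probability(features):
        """Probability of one node's feature bundle, generated one
        feature at a time, conditioning on features seen so far."""
        prob = 1.0
        context = ()
        for name, value in features:
            table = FEATURE_MODEL.get((name, context), {})
            prob *= table.get(value, 0.0)
            context = context + (value,)
        return prob

    def tree_probability(tree):
        """Because conditioning is local, a tree's probability factors
        into a product over its nodes; this bounded dependence is what
        admits polynomial-time inside, outside, and Viterbi algorithms."""
        prob = node_probability(tree["features"])
        for child in tree.get("children", []):
            prob *= tree_probability(child)
        return prob

    if __name__ == "__main__":
        tree = {"features": [("label", "NP"), ("head", "dog")],
                "children": []}
        print(tree_probability(tree))  # 0.6 * 0.7 = 0.42

The point of the sketch is the factorization: each node's score depends only on a bounded local context rather than on the whole tree, which is the property the abstract credits for efficient dynamic-programming parsing.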

1996

Parsing Algorithms and Metrics
Joshua Goodman
34th Annual Meeting of the Association for Computational Linguistics

An Empirical Study of Smoothing Techniques for Language Modeling
Stanley F. Chen | Joshua Goodman
34th Annual Meeting of the Association for Computational Linguistics

Efficient Algorithms for Parsing the DOP Model
Joshua Goodman
Conference on Empirical Methods in Natural Language Processing