Guihong Cao


2021

K-Adapter: Infusing Knowledge into Pre-Trained Models with Adapters
Ruize Wang | Duyu Tang | Nan Duan | Zhongyu Wei | Xuanjing Huang | Jianshu Ji | Guihong Cao | Daxin Jiang | Ming Zhou
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021

2020

XGLUE: A New Benchmark Dataset for Cross-lingual Pre-training, Understanding and Generation
Yaobo Liang | Nan Duan | Yeyun Gong | Ning Wu | Fenfei Guo | Weizhen Qi | Ming Gong | Linjun Shou | Daxin Jiang | Guihong Cao | Xiaodong Fan | Ruofei Zhang | Rahul Agrawal | Edward Cui | Sining Wei | Taroon Bharti | Ying Qiao | Jiun-Hung Chen | Winnie Wu | Shuguang Liu | Fan Yang | Daniel Campos | Rangan Majumder | Ming Zhou
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

In this paper, we introduce XGLUE, a new benchmark dataset to train large-scale cross-lingual pre-trained models using multilingual and bilingual corpora, and evaluate their performance across a diverse set of cross-lingual tasks. Compared to GLUE (Wang et al., 2019), which is labeled in English and includes only natural language understanding tasks, XGLUE has three main advantages: (1) it provides two corpora of different sizes for cross-lingual pre-training; (2) it provides 11 diversified tasks that cover both natural language understanding and generation scenarios; (3) for each task, it provides labeled data in multiple languages. We extend a recent cross-lingual pre-trained model, Unicoder (Huang et al., 2019), to cover both understanding and generation tasks, and evaluate it on XGLUE as a strong baseline. We also evaluate the base (12-layer) versions of Multilingual BERT, XLM and XLM-R for comparison.

The Microsoft Toolkit of Multi-Task Deep Neural Networks for Natural Language Understanding
Xiaodong Liu | Yu Wang | Jianshu Ji | Hao Cheng | Xueyun Zhu | Emmanuel Awa | Pengcheng He | Weizhu Chen | Hoifung Poon | Guihong Cao | Jianfeng Gao
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations

We present MT-DNN, an open-source natural language understanding (NLU) toolkit that makes it easy for researchers and developers to train customized deep learning models. Built upon PyTorch and Transformers, MT-DNN is designed to facilitate rapid customization for a broad spectrum of NLU tasks, using a variety of objectives (classification, regression, structured prediction) and text encoders (e.g., RNNs, BERT, RoBERTa, UniLM). A unique feature of MT-DNN is its built-in support for robust and transferable learning using the adversarial multi-task learning paradigm. To enable efficient production deployment, MT-DNN supports multi-task knowledge distillation, which can substantially compress a deep neural model without a significant performance drop. We demonstrate the effectiveness of MT-DNN on a wide range of NLU applications across general and biomedical domains. The software and pre-trained models will be publicly available at https://github.com/namisan/mt-dnn.
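
The multi-task design at the core of MT-DNN is a single shared text encoder feeding a set of task-specific heads, each trained with its own objective. The following is a minimal PyTorch sketch of that shared-encoder/multi-head pattern; the tiny encoder, the task names ("nli", "sts"), and all dimensions are illustrative assumptions, not MT-DNN's actual API.

```python
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    """Shared encoder with one lightweight head per task (illustrative sketch)."""

    def __init__(self, vocab_size=30522, hidden=256):
        super().__init__()
        # A one-layer Transformer stands in for a large pre-trained
        # encoder such as BERT or RoBERTa.
        self.embed = nn.Embedding(vocab_size, hidden)
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)
        # Task-specific heads: a 3-way classifier and a regressor (hypothetical tasks).
        self.heads = nn.ModuleDict({
            "nli": nn.Linear(hidden, 3),
            "sts": nn.Linear(hidden, 1),
        })

    def forward(self, token_ids, task):
        h = self.encoder(self.embed(token_ids))  # (batch, seq, hidden)
        return self.heads[task](h[:, 0])         # first-token pooling

model = MultiTaskModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
losses = {"nli": nn.CrossEntropyLoss(), "sts": nn.MSELoss()}

# One multi-task step: batches from different tasks update the shared
# encoder, but each batch is scored by its own head and loss.
batches = {
    "nli": (torch.randint(0, 30522, (8, 16)), torch.randint(0, 3, (8,))),
    "sts": (torch.randint(0, 30522, (8, 16)), torch.rand(8, 1)),
}
for task, (x, y) in batches.items():
    opt.zero_grad()
    losses[task](model(x, task), y).backward()
    opt.step()
```

The multi-task knowledge distillation the abstract mentions would then compress such a model by training a smaller student against the heads' soft outputs rather than the hard labels.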

2018

Semantic Parsing with Syntax- and Table-Aware SQL Generation
Yibo Sun | Duyu Tang | Nan Duan | Jianshu Ji | Guihong Cao | Xiaocheng Feng | Bing Qin | Ting Liu | Ming Zhou
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

We present a generative model that maps natural language questions to SQL queries. Existing neural network based approaches typically generate a SQL query word by word; however, a large portion of the generated results is incorrect or not executable because of mismatches between question words and table contents. Our approach addresses this problem by considering the structure of the table and the syntax of the SQL language. The quality of the generated SQL query is significantly improved through (1) learning to replicate content from column names, cells, or SQL keywords; and (2) improving the generation of the WHERE clause by leveraging the column-cell relation. Experiments are conducted on WikiSQL, a recently released dataset with the largest number of question-SQL pairs. Our approach significantly improves the state-of-the-art execution accuracy from 69.0% to 74.4%.
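
The replication step in (1) is essentially a copy/pointer mechanism: at each decoding step the model mixes a softmax over the SQL output vocabulary with a pointer distribution over encoded candidates (column names, cells, SQL keywords). The sketch below shows one common way to implement such a mixture; all shapes, names, and the gating scheme are assumptions for illustration, not the paper's exact model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CopyOrGenerate(nn.Module):
    """Mix a vocabulary softmax with a pointer over table tokens (illustrative sketch)."""

    def __init__(self, hidden=128, vocab_size=5000):
        super().__init__()
        self.gen_proj = nn.Linear(hidden, vocab_size)  # generate from the SQL vocabulary
        self.copy_gate = nn.Linear(hidden, 1)          # soft switch: copy vs. generate

    def forward(self, dec_state, cand_states, cand_ids):
        # dec_state:   (batch, hidden) decoder state at this step
        # cand_states: (batch, n_cand, hidden) encodings of columns / cells / keywords
        # cand_ids:    (batch, n_cand) vocabulary ids of those candidates
        p_gen = F.softmax(self.gen_proj(dec_state), dim=-1)       # (batch, vocab)
        scores = torch.bmm(cand_states, dec_state.unsqueeze(-1))  # (batch, n_cand, 1)
        p_copy = F.softmax(scores.squeeze(-1), dim=-1)            # (batch, n_cand)
        gate = torch.sigmoid(self.copy_gate(dec_state))           # (batch, 1)
        # Scatter the pointer distribution into vocabulary space and mix.
        p_copy_vocab = torch.zeros_like(p_gen).scatter_add_(1, cand_ids, p_copy)
        return gate * p_copy_vocab + (1 - gate) * p_gen

# Toy usage with random tensors: the result is a proper distribution
# over the vocabulary (each row sums to 1).
model = CopyOrGenerate()
probs = model(torch.randn(2, 128),
              torch.randn(2, 6, 128),
              torch.randint(0, 5000, (2, 6)))
```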

2008

Selecting Query Term Alternations for Web Search by Exploiting Query Contexts
Guihong Cao | Stephen Robertson | Jian-Yun Nie
Proceedings of ACL-08: HLT

2007

A system to mine large-scale bilingual dictionaries from monolingual web pages
Guihong Cao | Jianfeng Gao | Jian-Yun Nie
Proceedings of Machine Translation Summit XI: Papers

2006

An Information-Theoretic Approach to Automatic Evaluation of Summaries
Chin-Yew Lin | Guihong Cao | Jianfeng Gao | Jian-Yun Nie
Proceedings of the Human Language Technology Conference of the NAACL, Main Conference

Context-Dependent Term Relations for Information Retrieval
Jing Bai | Jian-Yun Nie | Guihong Cao
Proceedings of the 2006 Conference on Empirical Methods in Natural Language Processing

2005

NUKTI: English-Inuktitut Word Alignment System Description
Philippe Langlais | Fabrizio Gotti | Guihong Cao
Proceedings of the ACL Workshop on Building and Using Parallel Texts

RALI: SMT Shared Task System Description
Philippe Langlais | Guihong Cao | Fabrizio Gotti
Proceedings of the ACL Workshop on Building and Using Parallel Texts

2004

Combining Linguistic Features with Weighted Bayesian Classifier for Temporal Reference Processing
Guihong Cao | Wenjie Li | Kam-Fai Wong | Chunfa Yuan
COLING 2004: Proceedings of the 20th International Conference on Computational Linguistics

Applying Machine Learning to Chinese Temporal Relation Resolution
Wenjie Li | Kam-Fai Wong | Guihong Cao | Chunfa Yuan
Proceedings of the 42nd Annual Meeting of the Association for Computational Linguistics (ACL-04)

2002

Exploring Asymmetric Clustering for Statistical Language Modeling
Jianfeng Gao | Joshua Goodman | Guihong Cao | Hang Li
Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics