Hodong Lee

2022

Don’t Judge a Language Model by Its Last Layer: Contrastive Learning with Layer-Wise Attention Pooling
Dongsuk Oh | Yejin Kim | Hodong Lee | H. Howie Huang | Heuiseok Lim
Proceedings of the 29th International Conference on Computational Linguistics

Recent pre-trained language models (PLMs) have achieved great success on many natural language processing tasks by learning linguistic features and contextualized sentence representations. Since the attributes captured in the stacked layers of PLMs are not clearly identified, straightforward approaches such as taking the last layer's embedding are commonly used to derive sentence representations from PLMs. This paper introduces an attention-based pooling strategy that enables the model to preserve the layer-wise signals captured in each layer and to learn digested linguistic features for downstream tasks. A contrastive learning objective adapts the layer-wise attention pooling to both unsupervised and supervised settings. This regularizes the anisotropic space of pre-trained embeddings, making it more uniform. We evaluate our model on standard semantic textual similarity (STS) and semantic search tasks. Our method improves the performance of contrastively learned BERT-base and its variants.
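As a rough illustration of the pooling idea the abstract describes, here is a minimal sketch, not the authors' released implementation: each layer's [CLS] summary is scored against a learnable query, and the sentence vector is the attention-weighted sum over layers. The LayerAttentionPooling module name and the single-query scoring scheme are assumptions for this sketch, which uses a Hugging Face BERT-base encoder.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class LayerAttentionPooling(nn.Module):
    """Pool the stacked layer outputs of a PLM into one sentence vector.

    Illustrative sketch only; not the paper's exact architecture.
    """
    def __init__(self, hidden_size: int):
        super().__init__()
        # A single learnable query scores each layer's [CLS] summary (assumption).
        self.query = nn.Parameter(torch.randn(hidden_size))
        self.scale = hidden_size ** 0.5

    def forward(self, hidden_states):
        # hidden_states: tuple of (batch, seq_len, hidden) tensors,
        # one per layer (embeddings + 12 encoder layers for BERT-base).
        layer_cls = torch.stack([h[:, 0] for h in hidden_states], dim=1)  # (batch, layers, hidden)
        scores = layer_cls @ self.query / self.scale                      # (batch, layers)
        weights = scores.softmax(dim=-1).unsqueeze(-1)                    # (batch, layers, 1)
        return (weights * layer_cls).sum(dim=1)                           # (batch, hidden)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
pooler = LayerAttentionPooling(encoder.config.hidden_size)

batch = tokenizer(["A dog runs.", "A puppy is running."],
                  padding=True, return_tensors="pt")
with torch.no_grad():
    outputs = encoder(**batch)
sentence_emb = pooler(outputs.hidden_states)  # shape: (2, 768)
```

In the paper, such pooled embeddings would then be trained with a contrastive objective over positive and negative sentence pairs; this sketch covers only the pooling step.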

2016

An Effective Diverse Decoding Scheme for Robust Synonymous Sentence Translation
Youngki Park | Hwidong Na | Hodong Lee | Jihyun Lee | Inchul Song
Conferences of the Association for Machine Translation in the Americas: MT Researchers' Track

2002

Natural Language Interpretations for Heterogeneous Database Access
Hodong Lee | Jong C. Park
COLING 2002: The 19th International Conference on Computational Linguistics

2001

Automatic Augmentation of Translation Dictionary with Database Terminologies In Multilingual Query Interpretation
Hodong Lee | Jong C. Park
Proceedings of the ACL 2001 Workshop on Human Language Technology and Knowledge Management