2024
Exploring Domain Robust Lightweight Reward Models based on Router Mechanism
Hyuk Namgoong | Jeesu Jung | Sangkeun Jung | YoonHyung Roh
Findings of the Association for Computational Linguistics: ACL 2024
Recent advancements in large language models have relied heavily on large reward models trained through reinforcement learning from human feedback for fine-tuning. However, using a single reward model across various domains is not always optimal and often requires retraining from scratch when data from a new domain is introduced. To address these challenges, we explore the use of small language models operating in a domain-specific manner based on router mechanisms. Our three approaches are: 1) utilizing a mixture of experts to form a single reward model by modularizing an internal router and experts, 2) employing an external router to select the appropriate reward model from multiple domain-specific models, and 3) loading reward models and router adapters onto a single small language model using adapters, thereby reducing the parameter size. Experimental validation underscores the effectiveness of our approach, demonstrating performance comparable to baseline methods while also reducing the total parameter size.
Guidance-Based Prompt Data Augmentation in Specialized Domains for Named Entity Recognition
Hyeonseok Kang | Hyein Seo | Jeesu Jung | Sangkeun Jung | Du-Seong Chang | Riwoo Chung
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
While the abundance of rich and vast datasets across numerous fields has facilitated the advancement of natural language processing, sectors in need of specialized data types continue to struggle with the challenge of finding quality data. Our study introduces a novel guidance data augmentation technique that utilizes abstracted context and sentence structures to produce varied sentences while maintaining context-entity relationships, addressing data scarcity challenges. By fostering a closer relationship between context, sentence structure, and the role of entities, our method enhances the effectiveness of data augmentation. Consequently, it achieves diversification in both entity-related vocabulary and overall sentence structure while simultaneously improving the training performance of the named entity recognition task.
2023
Semantic Ambiguity Detection in Sentence Classification using Task-Specific Embeddings
Jong Myoung Kim | Young-jun Lee | Sangkeun Jung | Ho-jin Choi
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 5: Industry Track)
Ambiguity is a major obstacle to providing services based on sentence classification. However, because of the structural limitations of a service, there may not be sufficient contextual information to resolve the ambiguity. In this situation, we focus on ambiguity detection so that service design can take ambiguity into account. We utilize similarity in a semantic space to detect ambiguity in service scenarios and training data. In addition, we apply task-specific embeddings to improve performance. Our results demonstrate that ambiguities, and the resulting labeling errors in training data or scenarios, can be detected. Additionally, we confirm that our method can be used to debug services.
2018
Learning to Embed Semantic Correspondence for Natural Language Understanding
Sangkeun Jung | Jinsik Lee | Jiwon Kim
Proceedings of the 22nd Conference on Computational Natural Language Learning
While learning embedding models has yielded fruitful results in several NLP subfields, most notably Word2Vec, embedding correspondence remains relatively unexplored, especially in the context of natural language understanding (NLU), a task that typically extracts structured semantic knowledge from text. An NLU embedding model can facilitate analyzing and understanding the relationships between unstructured texts and their corresponding structured semantic knowledge, which is essential for both researchers and practitioners of NLU. Toward this end, we propose a framework that learns to embed the semantic correspondence between a text and its extracted semantic knowledge, called a semantic frame. One key contributed technique is semantic frame reconstruction, used to derive a one-to-one mapping between embedded vectors and their corresponding semantic frames. Embedding into semantically meaningful vectors and computing their distances in vector space provides a simple but effective way to measure semantic similarities. With the proposed framework, we demonstrate three key areas where the embedding model can be effective: visualization, semantic search, and re-ranking.
2017
Concept Equalization to Guide Correct Training of Neural Machine Translation
Kangil Kim | Jong-Hun Shin | Seung-Hoon Na | SangKeun Jung
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Neural machine translation decoders are usually conditional language models that sequentially generate words for target sentences. This approach is limited in finding the best word composition and requires the help of explicit methods such as beam search. To help NMT models learn correct compositional mechanisms, we propose concept equalization, which directly maps distributed representations of source and target sentences. In a translation experiment from English to French, concept equalization significantly improved translation quality by 3.00 BLEU points compared to a state-of-the-art NMT model.
2009
Automatic Agenda Graph Construction from Human-Human Dialogs using Clustering Method
Cheongjae Lee | Sangkeun Jung | Kyungduk Kim | Gary Geunbae Lee
Proceedings of Human Language Technologies: The 2009 Annual Conference of the North American Chapter of the Association for Computational Linguistics, Companion Volume: Short Papers
Hybrid Approach to User Intention Modeling for Dialog Simulation
Sangkeun Jung | Cheongjae Lee | Kyungduk Kim | Gary Geunbae Lee
Proceedings of the ACL-IJCNLP 2009 Conference Short Papers
2008
A Frame-Based Probabilistic Framework for Spoken Dialog Management Using Dialog Examples
Kyungduk Kim | Cheongjae Lee | Sangkeun Jung | Gary Geunbae Lee
Proceedings of the 9th SIGdial Workshop on Discourse and Dialogue
An Integrated Dialog Simulation Technique for Evaluating Spoken Dialog Systems
Sangkeun Jung | Cheongjae Lee | Kyungduk Kim | Gary Geunbae Lee
Coling 2008: Proceedings of the workshop on Speech Processing for Safety Critical Translation and Pervasive Applications
Robust Dialog Management with N-Best Hypotheses Using Dialog Examples and Agenda
Cheongjae Lee | Sangkeun Jung | Gary Geunbae Lee
Proceedings of ACL-08: HLT