Qi Wang


2025

MicroEdit: Neuron-level Knowledge Disentanglement and Localization in Lifelong Model Editing
Shiqi Wang | Qi Wang | Runliang Niu | He Kong | Yi Chang
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing

Large language models (LLMs) require continual knowledge updates to keep pace with the evolving world. While various model editing methods have been proposed, most face critical challenges in the context of lifelong learning due to two fundamental limitations: (1) Edit Overshooting - parameter updates intended for a specific fact spill over to unrelated regions, causing interference with previously retained knowledge; and (2) Knowledge Entanglement - polysemantic neurons’ overlapping encoding of multiple concepts makes it difficult to isolate and edit a single fact. In this paper, we propose MicroEdit, a neuron-level editing method that performs minimal and controlled interventions within LLMs. By leveraging a sparse autoencoder (SAE), MicroEdit disentangles knowledge representations and activates only a minimal set of necessary neurons for precise parameter updates. This targeted design enables fine-grained control over the editing scope, effectively mitigating interference and preserving unrelated knowledge. Extensive experiments show that MicroEdit outperforms prior methods and robustly handles lifelong knowledge editing across QA and Hallucination settings on LLaMA and Mistral.
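A minimal sketch of the general idea described in the abstract (SAE-based disentanglement followed by a localized parameter update). This is not the authors' released implementation; all function names, shapes, the top-k selection heuristic, and the assumption that SAE latents index rows of the edited MLP matrix are illustrative.

```python
# Hypothetical sketch of SAE-guided neuron-level editing (not the MicroEdit code).
# Assumes a pretrained sparse autoencoder (W_enc, b_enc) over a layer's hidden
# activations, and that each SAE latent maps to one row of the edited MLP matrix.
import torch

def select_edit_neurons(h, W_enc, b_enc, k=8):
    """Encode a hidden state with the SAE and return the k most active latents,
    i.e. the minimal set of neurons the edit is allowed to touch."""
    z = torch.relu(h @ W_enc + b_enc)      # sparse latent code of the hidden state
    return torch.topk(z, k).indices        # indices of neurons encoding the target fact

def apply_localized_update(W_mlp, h_key, target_value, neuron_idx, lr=1e-2):
    """Apply a rank-1 correction to the MLP weights, restricted to the selected
    neuron rows so parameters encoding unrelated knowledge are left untouched."""
    with torch.no_grad():
        residual = target_value - W_mlp[neuron_idx] @ h_key      # error on the edited fact
        W_mlp[neuron_idx] += lr * torch.outer(residual, h_key)   # confined rank-1 update
    return W_mlp
```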

Lattice @MultiGEC-2025: A Spitful Multilingual Language Error Correction System Using LLaMA
Olga Seminck | Yoann Dupont | Mathieu Dehouck | Qi Wang | Noé Durandard | Margo Novikov
Proceedings of the 14th Workshop on Natural Language Processing for Computer Assisted Language Learning

2024

An Incremental Clustering Baseline for Event Detection on Twitter
Marjolaine Ray | Qi Wang | Frédérique Mélanie-Becquet | Thierry Poibeau | Béatrice Mazoyer
Proceedings of the Workshop on the Future of Event Detection (FuturED)


2020

Knowledge-Enhanced Named Entity Disambiguation for Short Text
Zhifan Feng | Qi Wang | Wenbin Jiang | Yajuan Lyu | Yong Zhu
Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing

Named entity disambiguation is an important task that serves as a bridge between text and knowledge. However, the performance of existing methods drops dramatically for short text, which is widely used in practical application scenarios such as information retrieval and question answering. In this work, we propose a novel knowledge-enhanced method for named entity disambiguation. To address the information ambiguity and incompleteness of short text, two kinds of knowledge, a factual knowledge graph and a conceptual knowledge graph, are introduced to provide additional knowledge for the semantic matching between a candidate entity and the mention context. Our proposed method achieves significant improvement over previous methods on a large manually annotated short-text dataset, and also achieves state-of-the-art results on three standard datasets. The short-text dataset and the proposed model will be publicly available for research use.
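An illustrative sketch of the scoring scheme the abstract describes at a high level: matching the mention context against a candidate entity enriched with factual and conceptual KG evidence. This is not the paper's model; the embeddings, weights, and function names are hypothetical placeholders.

```python
# Hypothetical knowledge-enhanced candidate scoring for short-text entity
# disambiguation (not the paper's architecture; all names are illustrative).
import numpy as np

def cosine(u, v):
    """Cosine similarity with a small epsilon for numerical safety."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

def score_candidate(context_vec, entity_vec, fact_vecs, concept_vecs,
                    w_ctx=0.5, w_fact=0.3, w_concept=0.2):
    """Combine context-entity similarity with evidence from embedded facts and
    concepts attached to the candidate; the weights are placeholders, not tuned."""
    ctx_sim = cosine(context_vec, entity_vec)
    fact_sim = max((cosine(context_vec, f) for f in fact_vecs), default=0.0)
    concept_sim = max((cosine(context_vec, c) for c in concept_vecs), default=0.0)
    return w_ctx * ctx_sim + w_fact * fact_sim + w_concept * concept_sim

def disambiguate(context_vec, candidates):
    """candidates: list of (name, entity_vec, fact_vecs, concept_vecs) tuples;
    returns the name of the highest-scoring candidate entity."""
    return max(candidates, key=lambda c: score_candidate(context_vec, *c[1:]))[0]
```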