Fangqi Zhu
2023
Learning to Describe for Predicting Zero-shot Drug-Drug Interactions
Fangqi Zhu | Yongqi Zhang | Lei Chen | Bing Qin | Ruifeng Xu
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Adverse drug-drug interactions (DDIs) can compromise the effectiveness of concurrent drug administration, posing a significant challenge in healthcare. As the development of new drugs continues, the potential for unknown adverse effects resulting from DDIs becomes a growing concern. Traditional computational methods for DDI prediction may fail to capture interactions for new drugs due to the lack of prior knowledge about them. In this paper, we introduce a new problem setup, zero-shot DDI prediction, that deals with the case of new drugs. Leveraging textual information from online databases such as DrugBank and PubChem, we propose TextDDI, an innovative approach that combines a language model-based DDI predictor with a reinforcement learning (RL)-based information selector, enabling the selection of concise and pertinent text for accurate DDI prediction on new drugs. Empirical results show the benefits of the proposed approach in several settings, including zero-shot and few-shot DDI prediction, and show that the selected texts are semantically relevant. Our code and data are available at https://github.com/zhufq00/DDIs-Prediction.
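To make the described pipeline more concrete, below is a minimal Python sketch of a text-based DDI predictor: each drug's description is trimmed to its most relevant sentences and the two trimmed texts are concatenated into a single classifier input. This is not the TextDDI implementation; the keyword-overlap scorer, the example label names, and the stub `predict_ddi` function are illustrative assumptions standing in for the paper's RL-based information selector and language-model predictor.

```python
# Sketch of a text-based zero-shot DDI prediction pipeline.
# NOTE: not the TextDDI implementation; the scoring heuristic, labels,
# and stub classifier below are illustrative assumptions only.
from dataclasses import dataclass
from typing import List

@dataclass
class Drug:
    name: str
    description: str  # e.g., text drawn from DrugBank/PubChem

def select_sentences(description: str, query_terms: List[str], k: int = 3) -> List[str]:
    """Keep the k sentences most relevant to the query terms
    (a crude keyword-overlap stand-in for a learned information selector)."""
    sentences = [s.strip() for s in description.split(".") if s.strip()]
    scored = sorted(
        sentences,
        key=lambda s: sum(term.lower() in s.lower() for term in query_terms),
        reverse=True,
    )
    return scored[:k]

def build_prompt(drug_a: Drug, drug_b: Drug) -> str:
    """Concatenate the selected text of both drugs into one classifier input."""
    terms = ["interaction", "metabolism", "inhibit", "enzyme"]  # assumed query terms
    a_text = " ".join(select_sentences(drug_a.description, terms))
    b_text = " ".join(select_sentences(drug_b.description, terms))
    return f"Drug 1: {drug_a.name}. {a_text}\nDrug 2: {drug_b.name}. {b_text}\nInteraction type:"

def predict_ddi(prompt: str) -> str:
    """Stub for a language-model-based DDI predictor (hypothetical labels)."""
    return "increased_effect" if "inhibit" in prompt.lower() else "no_known_interaction"

if __name__ == "__main__":
    warfarin = Drug("Warfarin", "Warfarin is an anticoagulant. It is metabolized by CYP2C9 enzymes")
    fluconazole = Drug("Fluconazole", "Fluconazole is an antifungal. It can inhibit CYP2C9 enzymes")
    print(predict_ddi(build_prompt(warfarin, fluconazole)))
```

In practice the selector would be trained with RL to keep the prompt short while preserving predictive signal, and the predictor would be a fine-tuned language model rather than a rule.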
A Diffusion Model for Event Skeleton Generation
Fangqi Zhu | Lin Zhang | Jun Gao | Bing Qin | Ruifeng Xu | Haiqin Yang
Findings of the Association for Computational Linguistics: ACL 2023
Event skeleton generation, which aims to induce an event schema skeleton graph with abstracted event nodes and their temporal relations from a set of event instance graphs, is a critical step in the temporal complex event schema induction task. Existing methods effectively address this task from a graph generation perspective but suffer from noise sensitivity and error accumulation, e.g., the inability to correct errors while generating the schema. We therefore propose a novel Diffusion Event Graph Model (DEGM) to address these issues. Our DEGM is the first workable diffusion model for event skeleton generation, where embedding and rounding techniques with a custom edge-based loss are introduced to transform a discrete event graph into learnable latent representations. Furthermore, we propose a denoising training process to maintain the model’s robustness. Consequently, DEGM derives the final schema, where error correction is guaranteed by iteratively refining the latent representations during the schema generation process. Experimental results on three IED bombing datasets demonstrate that our DEGM achieves better results than other state-of-the-art baselines. Our code and data are available at https://github.com/zhufq00/EventSkeletonGeneration.
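The abstract's embedding-and-rounding idea can be illustrated with a small sketch: discrete event-node labels are embedded into continuous vectors, perturbed as in a forward diffusion process, and then rounded back to the nearest label embedding. The event-type vocabulary, embedding size, and linear noise interpolation below are assumptions for illustration, not the DEGM implementation (which also models edges and learns the embeddings and denoiser jointly).

```python
# Sketch of embedding-and-rounding for continuous diffusion over discrete
# event-graph nodes. NOT the DEGM code; vocabulary, embedding size, and
# noise schedule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

EVENT_TYPES = ["attack", "injure", "transport", "die"]   # hypothetical node labels
EMB_DIM = 8
embeddings = rng.normal(size=(len(EVENT_TYPES), EMB_DIM))  # a learned lookup in practice

def embed(node_ids: np.ndarray) -> np.ndarray:
    """Map discrete event-node ids to continuous latent vectors."""
    return embeddings[node_ids]

def add_noise(x0: np.ndarray, t: float) -> np.ndarray:
    """Forward diffusion step: interpolate toward Gaussian noise at time t in [0, 1]."""
    noise = rng.normal(size=x0.shape)
    return np.sqrt(1.0 - t) * x0 + np.sqrt(t) * noise

def round_to_vocab(x: np.ndarray) -> np.ndarray:
    """Rounding step: snap each latent vector back to the nearest label embedding."""
    dists = ((x[:, None, :] - embeddings[None, :, :]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)

if __name__ == "__main__":
    skeleton = np.array([0, 1, 3])                 # tiny node sequence: attack -> injure -> die
    latent = add_noise(embed(skeleton), t=0.2)     # noised latent representation
    print("recovered ids after rounding:", round_to_vocab(latent))
```

During generation, a trained denoiser would iteratively refine the latent representations before rounding, which is what allows errors to be corrected mid-generation rather than accumulating.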
Co-authors
- Bing Qin (2 papers)
- Ruifeng Xu (2 papers)
- Yongqi Zhang (1 paper)
- Lei Chen (1 paper)
- Lin Zhang (1 paper)