2023
CFSum Coarse-to-Fine Contribution Network for Multimodal Summarization
Min Xiao | Junnan Zhu | Haitao Lin | Yu Zhou | Chengqing Zong
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Multimodal summarization usually suffers from the problem that the contribution of the visual modality is unclear. Existing multimodal summarization approaches focus on designing fusion methods for the different modalities while ignoring under which conditions the visual modality is actually useful. Therefore, we propose a novel Coarse-to-Fine contribution network for multimodal Summarization (CFSum) that considers the different contributions of images to summarization. First, to eliminate the interference of useless images, we propose a pre-filter module that discards them. Second, to make accurate use of useful images, we propose two visual complement modules, one at the word level and one at the phrase level. Specifically, image contributions are calculated and used to guide the attention of both the textual and the visual modality. Experimental results show that CFSum significantly outperforms multiple strong baselines on the standard benchmark. Furthermore, our analysis verifies that useful images can even help generate non-visual words that are implicitly represented in the image.
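The abstract describes the architecture only at a high level. As a rough illustration of the coarse-to-fine idea (a coarse pre-filter that discards unhelpful images, followed by contribution-weighted cross-modal attention), the sketch below shows one plausible reading in PyTorch; the module name ImageContributionGate, the threshold, and all tensor shapes are assumptions made for this example, not the authors' actual CFSum implementation.

# Hedged sketch of a coarse-to-fine contribution mechanism for multimodal
# summarization. All names, shapes, and hyperparameters are assumptions,
# not the CFSum code.
import torch
import torch.nn as nn

class ImageContributionGate(nn.Module):
    """Coarse pre-filter plus fine-grained, contribution-weighted cross attention."""

    def __init__(self, dim: int, threshold: float = 0.5):
        super().__init__()
        self.scorer = nn.Linear(dim, 1)  # coarse usefulness score per image
        self.cross_attn = nn.MultiheadAttention(dim, num_heads=8, batch_first=True)
        self.threshold = threshold

    def forward(self, text_feats: torch.Tensor, image_feats: torch.Tensor) -> torch.Tensor:
        # text_feats: (batch, n_tokens, dim); image_feats: (batch, n_images, dim)
        # Coarse stage: score each image and mark low-contribution images to be ignored.
        contrib = torch.sigmoid(self.scorer(image_feats))    # (batch, n_images, 1)
        ignore_mask = contrib.squeeze(-1) < self.threshold   # True = drop this image
        # If every image would be dropped, keep them all to avoid empty attention.
        ignore_mask = ignore_mask & ~ignore_mask.all(dim=1, keepdim=True)
        # Fine stage: attend from text tokens to the remaining images, scaling
        # the visual keys/values by their contribution scores.
        weighted_images = image_feats * contrib
        fused, _ = self.cross_attn(text_feats, weighted_images, weighted_images,
                                   key_padding_mask=ignore_mask)
        return text_feats + fused  # text representation plus its visual complement

gate = ImageContributionGate(dim=512)
text = torch.randn(2, 30, 512)
images = torch.randn(2, 5, 512)
print(gate(text, images).shape)  # torch.Size([2, 30, 512])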
Multi-doc Hybrid Summarization via Salient Representation Learning
Min Xiao
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 5: Industry Track)
Multi-document summarization has been gaining increasing attention recently and serves as an invaluable tool for obtaining key facts from a large pool of information. In this paper, we propose a multi-document hybrid summarization approach that simultaneously generates a human-readable summary and extracts the corresponding key evidence from multi-document inputs. To fulfill that purpose, we craft a salient representation learning method to induce latent salient features that are effective for joint evidence extraction and summary generation. To train this model, we conduct multi-task learning to optimize a composited loss, constructed over the extractive and abstractive sub-components in a hierarchical way. We implement the system on top of a ubiquitously adopted transformer architecture and conduct experimental studies on multiple datasets across two domains, achieving superior performance over the baselines.
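As a rough illustration of the composited loss described above, the following sketch combines a sentence-level evidence-extraction loss with a token-level summary-generation loss under a single weighting hyperparameter; the function name, tensor layout, and weighting scheme (lambda_ext) are assumptions made for this example, not the paper's actual hierarchical construction.

# Hedged sketch of a composited multi-task loss over extractive and abstractive
# sub-components; the weighting and all names are illustrative assumptions only.
import torch
import torch.nn.functional as F

def composited_loss(extractive_logits: torch.Tensor,
                    evidence_labels: torch.Tensor,
                    generation_logits: torch.Tensor,
                    summary_ids: torch.Tensor,
                    lambda_ext: float = 0.5) -> torch.Tensor:
    """Joint loss for evidence extraction and summary generation."""
    # Extractive part: binary classification of each input sentence as key evidence.
    loss_ext = F.binary_cross_entropy_with_logits(extractive_logits,
                                                  evidence_labels.float())
    # Abstractive part: token-level cross-entropy over the generated summary.
    loss_abs = F.cross_entropy(generation_logits.view(-1, generation_logits.size(-1)),
                               summary_ids.view(-1),
                               ignore_index=-100)
    # Composite the two objectives; a hierarchical schedule could instead
    # anneal lambda_ext over the course of training.
    return lambda_ext * loss_ext + (1.0 - lambda_ext) * loss_abs

# Example call with hypothetical shapes: 8 candidate sentences, a 50-token summary.
ext_logits = torch.randn(8)
ext_labels = torch.randint(0, 2, (8,))
gen_logits = torch.randn(50, 32000)        # (summary_len, vocab_size)
gold_ids = torch.randint(0, 32000, (50,))
print(composited_loss(ext_logits, ext_labels, gen_logits, gold_ids))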
2019
STAC: Science Toolkit Based on Chinese Idiom Knowledge Graph
Meiling Wang | Min Xiao | Changliang Li | Yu Guo | Zhixin Zhao | Xiaonan Liu
Proceedings of the Workshop on Extracting Structured Knowledge from Scientific Publications
Chinese idioms (Cheng Yu) reflect five thousand years of Chinese history and culture, and they record a large number of the scientific achievements of ancient China. However, existing online Chinese idiom dictionaries offer limited support for scientific exploration. In this paper, we first construct a Chinese idiom knowledge graph by extracting domains and dynasties and associating them with idioms, and on top of this knowledge graph we propose a Science Toolkit for Ancient China (STAC) aimed at supporting scientific exploration. In the STAC toolkit, the idiom navigator helps users explore the overall scientific progress of ancient China from the perspective of idioms with visualization tools, while the idiom card and idiom QA components shorten the action path and keep users from being interrupted while reading and writing. The current STAC toolkit is deployed at http://120.92.208.22:7476/demo/#/stac.
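The graph construction described above (idioms associated with extracted domains and dynasties) can be pictured as a small labeled graph. The sketch below uses networkx with placeholder entries purely for illustration; it is not the STAC data, schema, or backend.

# Hedged sketch of an idiom knowledge graph in the spirit of the STAC
# description: idiom nodes linked to dynasty and domain nodes.
# All entries below are placeholders, not actual STAC content.
import networkx as nx

g = nx.MultiDiGraph()

def add_idiom(idiom: str, dynasty: str, domain: str) -> None:
    """Insert an idiom node and connect it to its dynasty and domain."""
    g.add_node(idiom, type="idiom")
    g.add_node(dynasty, type="dynasty")
    g.add_node(domain, type="domain")
    g.add_edge(idiom, dynasty, relation="originated_in")
    g.add_edge(idiom, domain, relation="belongs_to")

add_idiom("idiom_a", "dynasty_x", "astronomy")
add_idiom("idiom_b", "dynasty_x", "medicine")

# Navigator-style query: all idioms attached to a given domain.
idioms_in_astronomy = [u for u, v, d in g.edges(data=True)
                       if d["relation"] == "belongs_to" and v == "astronomy"]
print(idioms_in_astronomy)  # ['idiom_a']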
2015
Learning Hidden Markov Models with Distributed State Representations for Domain Adaptation
Min Xiao | Yuhong Guo
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Annotation Projection-based Representation Learning for Cross-lingual Dependency Parsing
Min Xiao | Yuhong Guo
Proceedings of the Nineteenth Conference on Computational Natural Language Learning
2014
Distributed Word Representation Learning for Cross-Lingual Dependency Parsing
Min Xiao | Yuhong Guo
Proceedings of the Eighteenth Conference on Computational Natural Language Learning
2013
Learning Latent Word Representations for Domain Adaptation using Supervised Word Clustering
Min Xiao | Feipeng Zhao | Yuhong Guo
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing
Semi-Supervised Representation Learning for Cross-Lingual Text Classification
Min Xiao | Yuhong Guo
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing
Online Active Learning for Cost Sensitive Domain Adaptation
Min Xiao | Yuhong Guo
Proceedings of the Seventeenth Conference on Computational Natural Language Learning
2012
Multi-View AdaBoost for Multilingual Subjectivity Analysis
Min Xiao | Yuhong Guo
Proceedings of COLING 2012
Semi-supervised Representation Learning for Domain Adaptation using Dynamic Dependency Networks
Min Xiao | Yuhong Guo | Alexander Yates
Proceedings of COLING 2012