Xuansong Xie
2023
WordArt Designer: User-Driven Artistic Typography Synthesis using Large Language Models
Jun-Yan He | Zhi-Qi Cheng | Chenyang Li | Jingdong Sun | Wangmeng Xiang | Xianhui Lin | Xiaoyang Kang | Zengke Jin | Yusen Hu | Bin Luo | Yifeng Geng | Xuansong Xie
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: Industry Track
This paper introduces WordArt Designer, a user-driven framework for artistic typography synthesis built on Large Language Models (LLMs). The system incorporates four key modules: the LLM Engine, SemTypo, StyTypo, and TexTypo modules. 1) The LLM Engine, powered by an LLM (e.g., GPT-3.5), interprets user inputs and generates actionable prompts for the other modules, thereby transforming abstract concepts into tangible designs. 2) The SemTypo module optimizes font designs using semantic concepts, striking a balance between artistic transformation and readability. 3) Building on the semantic layout provided by the SemTypo module, the StyTypo module creates smooth, refined images. 4) The TexTypo module further enhances the design’s aesthetics through texture rendering, enabling the generation of inventive textured fonts. Notably, WordArt Designer highlights the fusion of generative AI with artistic typography. Experience its capabilities on ModelScope: https://www.modelscope.cn/studios/WordArt/WordArt.
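The four-stage pipeline described in the abstract can be sketched as a simple orchestration: the LLM Engine produces prompts, which flow through SemTypo, StyTypo, and TexTypo in sequence. The module names follow the abstract, but every function body below is a hypothetical stub, not the authors' implementation:

```python
# Hypothetical sketch of the WordArt Designer pipeline from the abstract.
# Each stage is a placeholder stub standing in for the real module.

def llm_engine(user_request: str) -> dict:
    # The LLM Engine (e.g., GPT-3.5) would turn a free-form request into
    # actionable prompts for the downstream modules.
    return {
        "semantic_prompt": f"concept for: {user_request}",
        "style_prompt": "smooth, refined",
        "texture_prompt": "inventive texture",
    }

def sem_typo(semantic_prompt: str) -> str:
    # SemTypo: optimize the font layout against the semantic concept
    # while preserving readability.
    return f"layout({semantic_prompt})"

def sty_typo(layout: str, style_prompt: str) -> str:
    # StyTypo: refine the semantic layout into a smooth, stylized image.
    return f"stylized({layout}; {style_prompt})"

def tex_typo(stylized: str, texture_prompt: str) -> str:
    # TexTypo: render textures onto the stylized glyphs.
    return f"textured({stylized}; {texture_prompt})"

def wordart_designer(user_request: str) -> str:
    # End-to-end flow: user input -> prompts -> layout -> style -> texture.
    prompts = llm_engine(user_request)
    layout = sem_typo(prompts["semantic_prompt"])
    stylized = sty_typo(layout, prompts["style_prompt"])
    return tex_typo(stylized, prompts["texture_prompt"])
```

The strings here merely trace the data flow; in the real system each stage would produce images or layout parameters rather than text.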
2019
A Multi-Task Learning Framework for Extracting Bacteria Biotope Information
Qi Zhang | Chao Liu | Ying Chi | Xuansong Xie | Xiansheng Hua
Proceedings of the 5th Workshop on BioNLP Open Shared Tasks
This paper presents a novel transfer multi-task learning method for the Bacteria Biotope rel+ner task at BioNLP-OST 2019. To alleviate the data-deficiency problem in domain-specific information extraction, we use BERT (Bidirectional Encoder Representations from Transformers) and pre-train it with masked language modeling and next-sentence prediction on both a general corpus and a medical corpus such as PubMed. In the fine-tuning stage, we fine-tune our relation extraction layer and mention recognition layer on top of BERT to extract mentions and relations simultaneously. The evaluation results show that our method achieves the best performance on all metrics (including slot error rate, precision, and recall) in the Bacteria Biotope rel+ner subtask.
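The multi-task setup described in the abstract — a shared encoder feeding a token-level mention recognition head and a sequence-level relation extraction head — can be sketched as follows. This is a minimal illustration with a random stand-in for the BERT encoder; the head shapes, tag counts, and pooling choice are assumptions, not the paper's architecture:

```python
import numpy as np

# Illustrative dimensions (not from the paper).
HIDDEN, NUM_TAGS, NUM_RELS, SEQ_LEN = 16, 5, 3, 8

rng = np.random.default_rng(0)

def encoder(token_ids):
    # Stand-in for BERT: a fixed random embedding table. In the real
    # system this would be the pre-trained, fine-tuned transformer.
    table = np.random.default_rng(42).standard_normal((100, HIDDEN))
    return table[token_ids]  # (seq_len, hidden)

# Two task heads sharing the encoder output.
W_ner = rng.standard_normal((HIDDEN, NUM_TAGS))   # mention recognition
W_rel = rng.standard_normal((HIDDEN, NUM_RELS))   # relation extraction

def predict(token_ids):
    h = encoder(token_ids)                 # shared representation
    ner_tags = (h @ W_ner).argmax(axis=1)  # one tag per token
    rel_label = int((h.mean(axis=0) @ W_rel).argmax())  # pooled sequence label
    return ner_tags, rel_label

tags, rel = predict(np.arange(SEQ_LEN))
```

Training both heads jointly against a shared encoder is what makes this multi-task: gradients from both the mention and relation losses would update the same BERT parameters.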