SynC-LLM: Generation of Large-Scale Synthetic Circuit Code with Hierarchical Language Models

Shang Liu, Yao Lu, Wenji Fang, Jing Wang, Zhiyao Xie


Abstract
In recent years, AI-assisted integrated circuit (IC) design methods have shown great potential in boosting IC design efficiency. However, this emerging technique is fundamentally limited by the severe scarcity of publicly accessible large-scale circuit design data, most of which is private IP owned by semiconductor companies. In this work, we propose SynC-LLM, the first technique that exploits an LLM's ability to generate new large-scale synthetic digital circuits. In our hierarchical circuit generation process, we first design a directed graph diffusion model to learn and generate the skeleton of large circuits with sequential cells. Then we propose a cone function retrieval technique to annotate each sequential node in the skeleton with a function description. Finally, we apply a level-by-level customized prompting technique that uses an LLM to complete the code at every skeleton cone. Experiments show that our generated circuits are not only valid and fully functional, but also closely resemble realistic large-scale designs and can significantly improve AI models' performance on multiple IC design tasks. The code and data are open-sourced at https://github.com/hkust-zhiyao/SynCircuitData.
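The abstract's three-stage pipeline (skeleton generation, cone function annotation, level-by-level code completion) could be sketched roughly as below. This is a minimal illustrative outline only: all names (ConeNode, generate_skeleton, retrieve_cone_function, complete_cone_with_llm) are hypothetical stand-ins, not the authors' released API, and each stage body is a toy placeholder for the actual diffusion model, retrieval step, and LLM call.

```python
# Hypothetical sketch of the hierarchical generation flow described in the
# abstract; none of these names come from the SynCircuitData repository.
from dataclasses import dataclass, field

@dataclass
class ConeNode:
    """A sequential cell in the generated circuit skeleton."""
    node_id: int
    level: int                      # topological level in the directed skeleton
    fanin_ids: list[int] = field(default_factory=list)
    function_desc: str = ""         # natural-language function annotation
    rtl_code: str = ""              # code completed for this cone

def generate_skeleton(num_nodes: int) -> list[ConeNode]:
    """Stage 1 (stand-in): the paper uses a directed graph diffusion model to
    sample a skeleton of sequential cells; here we build a toy layered DAG."""
    nodes = []
    for i in range(num_nodes):
        level = i // 4
        fanins = [j for j in range(max(0, i - 4), i) if j // 4 == level - 1]
        nodes.append(ConeNode(node_id=i, level=level, fanin_ids=fanins))
    return nodes

def retrieve_cone_function(node: ConeNode) -> str:
    """Stage 2 (stand-in): cone function retrieval annotates each sequential
    node with a function description; here we return a placeholder string."""
    return f"register driven by {len(node.fanin_ids)} upstream signals"

def complete_cone_with_llm(node: ConeNode, nodes: list[ConeNode]) -> str:
    """Stage 3 (stand-in): a level-by-level customized prompt asks an LLM to
    write the code for this cone given its already-completed fan-in cones."""
    context = "".join(n.rtl_code for n in nodes if n.node_id in node.fanin_ids)
    prompt = f"// Function: {node.function_desc}\n{context}"
    # A real implementation would send `prompt` to an LLM; we emit a stub.
    return prompt + f"always @(posedge clk) q{node.node_id} <= /* LLM */;\n"

def sync_llm_pipeline(num_nodes: int = 12) -> list[ConeNode]:
    nodes = generate_skeleton(num_nodes)
    for node in nodes:
        node.function_desc = retrieve_cone_function(node)
    # Complete code level by level, so every cone sees finished fan-ins.
    for node in sorted(nodes, key=lambda n: n.level):
        node.rtl_code = complete_cone_with_llm(node, nodes)
    return nodes
```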
Anthology ID:
2025.emnlp-main.877
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
17361–17376
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.877/
Cite (ACL):
Shang Liu, Yao Lu, Wenji Fang, Jing Wang, and Zhiyao Xie. 2025. SynC-LLM: Generation of Large-Scale Synthetic Circuit Code with Hierarchical Language Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 17361–17376, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
SynC-LLM: Generation of Large-Scale Synthetic Circuit Code with Hierarchical Language Models (Liu et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.877.pdf
Checklist:
2025.emnlp-main.877.checklist.pdf