
Circuit Transformer: End-to-end Logic Synthesis by Predicting the Next Gate
Description

Recent advances in large language models (LLMs) have computationally mastered human language through predictive modeling. Extending this concept to electronic design, we explore the idea of a "circuit model" trained on circuits to predict the next logic gate, addressing structural complexities and equivalence constraints. By encoding circuits as memory-less trajectories and employing equivalence-preserving decoding, our trained "Circuit Transformer" with 88M parameters demonstrates impressive performance in end-to-end logic synthesis. With the aid of Monte-Carlo tree search, it significantly outperforms resyn2 in ABC on small circuits while retaining strict equivalence, showcasing the potential of generative AI in conquering electronic design challenges.
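To make the "predict the next gate" idea concrete, below is a toy sketch of gate-by-gate decoding toward a target truth table. This is not the paper's method: the Transformer's learned logits are replaced by a hand-rolled similarity score, and the equivalence-preserving mask is reduced to simply rejecting candidate gates that duplicate an existing signal. All names (`synthesize`, the gate-tuple encoding) are illustrative assumptions.

```python
from itertools import combinations

def synthesize(target_tt, num_inputs, max_gates=16):
    """Greedy next-gate decoding toward a target truth table.

    Gates are 2-input ANDs with optional inversions on inputs and
    output (an AIG-style primitive). At each step, candidates that
    duplicate an existing signal are masked out -- a toy stand-in
    for equivalence-preserving decoding -- and a similarity score
    to the target plays the role of a trained model's logits.
    """
    n = 2 ** num_inputs
    # Truth table of each primary input: one bit per input assignment.
    signals = [tuple((row >> i) & 1 for row in range(n))
               for i in range(num_inputs)]
    gates = []
    while len(gates) < max_gates:
        if target_tt in signals:
            return gates
        best = None
        for a, b in combinations(range(len(signals)), 2):
            for ia in (0, 1):
                for ib in (0, 1):
                    for io in (0, 1):
                        tt = tuple(((signals[a][r] ^ ia) &
                                    (signals[b][r] ^ ib)) ^ io
                                   for r in range(n))
                        if tt in signals:   # mask redundant gates
                            continue
                        score = sum(x == y for x, y in zip(tt, target_tt))
                        if best is None or score > best[0]:
                            best = (score, (a, ia, b, ib, io), tt)
        if best is None:
            return None                     # no legal next gate
        gates.append(best[1])               # commit the "next gate"
        signals.append(best[2])
    return gates if target_tt in signals else None
```

For example, `synthesize((0, 1, 1, 0), 2)` decodes a gate sequence realizing 2-input XOR; the greedy score does not guarantee a minimal circuit, which is where the poster's Monte-Carlo tree search would come in.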
Event Type
Work-in-Progress Poster
Time
Wednesday, June 26, 5:00pm - 6:00pm PDT
Location
Level 2 Lobby
Topics
AI
Autonomous Systems
Cloud
Design
EDA
Embedded Systems
IP
Security