Invited: HDL-GPT: High Quality HDL Is All You Need
Description
This talk presents Hardware Description Language Generative Pre-Trained Transformers (HDL-GPT), a novel approach that leverages the vast repository of open-source Hardware Description Language (HDL) code to train superior-quality large code models. The core premise of this research is the hypothesis that high-quality HDL is all you need to create models with exceptional performance and broad zero-shot generalization abilities. The talk elucidates the methods employed for the curation and augmentation of large corpora from open-source HDL code, transforming data of highly variable quality into high-quality data through careful prompting and context maintenance. We observe that careful selection, filtering, and augmentation of data across HDLs can yield powerful models that surpass current state-of-the-art models. We also explore the impact of different fine-tuning methods on the quality of results, performing experiments across a range of fine-tuned state-of-the-art LLMs. We demonstrate improvements of 50% to 200% over state-of-the-art HDL models on current benchmarks in tasks ranging from HDL circuit explanation, code generation, and formal and simulation testbench creation to bug finding and fixing and tasks in high-speed circuit design. HDL-GPT opens new avenues for the development of advanced model training techniques for circuit design tasks.
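
To make the curation-and-augmentation idea concrete, here is a minimal sketch of what selecting, filtering, and prompt-wrapping open-source HDL files could look like. Every heuristic, threshold, and name in it (HDLSample, looks_like_quality_verilog, augment_with_context, the 5% comment ratio) is an illustrative assumption, not the authors' actual pipeline.

```python
"""Sketch of an HDL corpus curation pass: filter variable-quality
Verilog files with crude heuristics, then wrap survivors in an
instruction-style prompt. All heuristics here are assumptions."""
import re
from dataclasses import dataclass


@dataclass
class HDLSample:
    path: str
    source: str


def looks_like_quality_verilog(sample: HDLSample) -> bool:
    """Keep files that define a module, carry some comments,
    and are neither trivially short nor enormous (assumed bounds)."""
    lines = sample.source.splitlines()
    if not (10 <= len(lines) <= 2000):
        return False
    if not re.search(r"\bmodule\s+\w+", sample.source):
        return False
    comment_lines = sum(1 for l in lines if l.strip().startswith("//"))
    return comment_lines / len(lines) >= 0.05  # assumed comment-density threshold


def augment_with_context(sample: HDLSample) -> str:
    """Augmentation step: attach file context and an instruction so
    raw code becomes a consistent instruction-style training example."""
    return (
        "// File: " + sample.path + "\n"
        "// Task: explain and reproduce the following HDL module.\n"
        + sample.source
    )


def curate(corpus: list[HDLSample]) -> list[str]:
    """Selection + filtering + augmentation in one pass."""
    return [augment_with_context(s) for s in corpus if looks_like_quality_verilog(s)]


if __name__ == "__main__":
    demo = [
        HDLSample(
            "adder.v",
            "// simple adder\nmodule adder(input a, b, output s);\n"
            + "assign s = a ^ b; // xor sum bit\n" * 10
            + "endmodule\n",
        ),
        HDLSample("junk.v", "assign x = y;"),  # filtered out: too short, no module
    ]
    for example in curate(demo):
        print(example[:120])
```

The point of the augmentation step, as the abstract frames it, is that careful prompting and context maintenance can turn variable-quality raw code into uniformly structured training data; the filter merely decides which files are worth that treatment.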
Event Type
Special Session (Research)
Time
Wednesday, June 26, 2:00pm - 2:30pm PDT
Location
3006, 3rd Floor
Topics
AI