Research Manuscript: Foundation Models for EDA and Beyond
Description
This session dives into the intersection of machine learning with EDA and other cutting-edge applications. Attendees will see how large language models (LLMs) are revolutionizing tasks ranging from fixing RTL syntax errors and designing operational amplifiers to dramatically reducing AlphaFold's protein-folding training time. The session then explores sustainable benchmarking for accelerator-aware NAS, real-time network traffic analytics, anomaly detection at the edge, and ML-driven optimization of physical design parameters for 3D ICs.
Event Type
Research Manuscript
Time
Thursday, June 27, 1:30pm - 3:30pm PDT
Location
3002, 3rd Floor
Topics
AI
Keywords
AI/ML Application and Infrastructure
Presentations
1:30pm - 1:45pm PDT: Automatically Fixing RTL Syntax Errors with Large Language Model
1:45pm - 2:00pm PDT: Artisan: Automated Operational Amplifier Design via Domain-specific Large Language Model
2:00pm - 2:15pm PDT: ScaleFold: Reducing AlphaFold Initial Training Time to 10 Hours
2:15pm - 2:30pm PDT: Data is all you need: Finetuning LLMs for Chip Design via an Automated design-data augmentation framework
2:30pm - 2:45pm PDT: Accel-NASBench: Sustainable Benchmarking for Accelerator-Aware NAS
2:45pm - 3:00pm PDT: TrafficHD: Efficient Hyperdimensional Computing for Real-Time Network Traffic Analytics
3:00pm - 3:15pm PDT: VARADE: a Variational-based AutoRegressive model for Anomaly Detection on the Edge
3:15pm - 3:30pm PDT: ML-based Physical Design Parameter Optimization for 3D ICs: From Parameter Selection to Optimization