Synopsys to add generative AI to design tools

Technology News | By Nick Flaherty

Synopsys is looking to add generative AI to its EDA tools to boost chip design productivity.

The company already uses a range of AI techniques in its tools, from deep neural networks (DNNs) to recurrent neural networks (RNNs). These are incorporated into the DSO.ai, VSO.ai and TSO.ai tools, which have been used for well over 100 chip tapeouts.

Now the company is looking at the transformer network technologies used in generative AI (Gen-AI) to further enhance the tools, says founder and retiring CEO Aart de Geus.

“We intend to harness Gen-AI capabilities into Synopsys.ai. We see this delivering further advances in design assistance, design exploration, and design generation,” said de Geus.

Generative AI uses transformers in large language models for a variety of applications, from chatbots such as ChatGPT to coding co-pilot tools. These tools can be used with programming languages to enhance engineer productivity, from finding bugs to adding documentation. They can also be used to generate Verilog code for the synthesis of logic functions, for example automatically generating RISC-V microcontroller cores with associated test and verification harnesses.
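
As a minimal sketch of how such a flow could be wired up (this is generic illustration, not Synopsys tooling): the Python below prompts a model for synthesizable RTL plus a self-checking testbench and saves the result for a downstream simulation and synthesis flow. The `complete()` call, the `generate_rtl()` helper and the adder spec are all hypothetical placeholders.

```python
from pathlib import Path

def complete(prompt: str) -> str:
    """Hypothetical stand-in for an LLM completion call; a real copilot
    integration would invoke its model provider here."""
    raise NotImplementedError("plug in an LLM backend")

# A toy spec of the kind discussed above: a small logic function. Production
# use targets far larger blocks, such as RISC-V cores with test harnesses.
SPEC = ("4-bit ripple-carry adder `adder4`: inputs a[3:0], b[3:0], cin; "
        "outputs sum[3:0], cout.")

def generate_rtl(spec: str, out_file: Path) -> Path:
    """Ask the model for synthesizable RTL plus a self-checking testbench,
    then write it out for simulation and synthesis."""
    prompt = ("Write synthesizable Verilog-2001 implementing the spec below, "
              "followed by a self-checking testbench:\n" + spec)
    out_file.write_text(complete(prompt))
    return out_file

# generate_rtl(SPEC, Path("adder4.v"))  # uncomment once complete() is wired up
```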

Synopsys has a large database of IP in Verilog that can be used to train LLMs, improving the accuracy of generative AI frameworks so that they produce more reliable designs and avoid hallucination, where a framework invents plausible but incorrect results.

“On the design flow spectrum from optionality to optimality, in other words, moving from many options in early architectures, to highly-tuned, error-free tape-outs, Gen-AI techniques will augment the exploration, accelerate design choices, and automate some design generation, thus broadening the dimensions of intelligence in Synopsys.ai,” he added.

At the Design Automation Conference (DAC) this year, Synopsys and Georgia Tech won the Best Paper Award for their work on self-supervised reinforcement learning. The collaborative research explored methods to drive concurrent clock and data-path (CCD) optimization in physical design to achieve the highest-frequency, lowest-power chips.

Reinforcement learning CCD (RL-CCD) emerged as a method to improve CCD quality by prioritizing endpoints for useful skew optimization using a self-supervised attention mechanism. The team’s experimental results on 18 industrial designs at 5nm to 12nm process technologies demonstrated that RL-CCD can deliver up to 64% better total negative slack (TNS) compared to the best-in-class production solution for CCD.
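
As a loose intuition for the prioritization step (not the paper's actual method): the toy Python below runs a bandit-style loop in which learnable per-endpoint scores, a stand-in for the self-supervised attention, are reinforced by the TNS improvement gained from skewing the endpoints they select. The endpoint count, the noise-based exploration and the 20% slack-recovery model are all invented for illustration.

```python
import random

random.seed(0)

# Toy timing view: 20 endpoints with slack in picoseconds (negative = failing).
# Everything here is an illustrative stand-in for an industrial CCD engine.
slacks = [random.uniform(-50.0, 10.0) for _ in range(20)]
N, K, LR = len(slacks), 4, 0.1

def tns(s):
    """Total negative slack: the sum of slack over all failing endpoints."""
    return sum(x for x in s if x < 0)

# Learnable per-endpoint scores stand in for RL-CCD's self-supervised
# attention; here they are trained directly from the reward signal.
scores = [0.0] * N

for _ in range(500):
    # Explore: rank endpoints by score plus noise and take the top K.
    noisy = [s + random.gauss(0.0, 1.0) for s in scores]
    picked = sorted(range(N), key=lambda i: noisy[i], reverse=True)[:K]

    # Toy "useful skew": borrowing time recovers ~20% of a picked endpoint's
    # violation (purely illustrative; a real engine moves clock latencies).
    gains = {i: 0.2 * max(0.0, -slacks[i]) for i in picked}
    reward = sum(gains.values())  # equals the TNS improvement on this trial

    # Reinforce each picked endpoint in proportion to the slack it recovered.
    for i in picked:
        scores[i] += LR * gains[i] / 10.0

worst = sorted(range(N), key=lambda i: slacks[i])[:K]
learned = sorted(range(N), key=lambda i: scores[i], reverse=True)[:K]
print("worst endpoints:  ", sorted(worst))
print("highest attention:", sorted(learned))
```

Run repeatedly, the learned scores concentrate on the most violating endpoints, which is the intuition behind letting an attention mechanism decide where useful skew pays off most.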

www.synopsys.com
