
Digging EDA’s big data to speed up chip verification and tape-out


Feature articles
By eeNews Europe



By sweeping through the big data generated by multi-vendor tool suites as a design is being simulated, verified and debugged, IC Manage says it can speed up verification analytics by 10 to 100x while pinpointing the verification bottlenecks tied to particular design changes, helping companies allocate their resources in near real-time to accelerate their verification schedules.

Catching up with Dean Drako, IC Manage’s president, CEO and co-founder, eeNews Europe got a glimpse of how it all got started.

“Up until 2014/2015, we were a design management company; we would manage all the data necessary to get the chips out of the door. There are literally millions upon millions of files that are checked and need editing; then engineers must ensure that the right files are used in the final design,” explained Drako.

“Around 2014, we started working on the big data churned out by the different EDA tools. It is very different from the data of the chip itself, which is very structured and hierarchical, with transistors and gates and IP blocks. When doing all the simulation, synthesis and verification runs, you generate a tremendous number of log files. That runtime data, normally you don’t have a place to store it: you generate a gigabyte file, look at it, and if the run fails you delete it and do another run. That data is very unstructured, it can come from many different tools; it just looks like massive text files,” Drako told us.
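That unstructured, multi-vendor nature is what makes the data hard to exploit. As a minimal sketch of the idea in Python, the snippet below sweeps a directory of simulator logs for pass/fail markers; the directory name, file pattern and error keywords are hypothetical, since real log formats vary by vendor and IC Manage’s actual parsers are not public.

```python
# Minimal sketch: classify heterogeneous EDA log files as pass or fail.
# The "logs" directory, the *.log glob and the error markers below are
# hypothetical; real simulator logs differ per vendor and would need
# dedicated per-tool parsers.
import re
from pathlib import Path

# Hypothetical per-vendor failure markers.
FAIL_PATTERNS = [re.compile(p) for p in (r"\bUVM_FATAL\b", r"\bError\b", r"\*E\b")]

def classify_log(path: Path) -> str:
    """Return 'fail' if any known error marker appears in the log, else 'pass'."""
    with path.open(errors="replace") as f:
        for line in f:
            if any(p.search(line) for p in FAIL_PATTERNS):
                return "fail"
    return "pass"

if __name__ == "__main__":
    results = {p.name: classify_log(p) for p in Path("logs").glob("*.log")}
    failed = [name for name, status in results.items() if status == "fail"]
    print(f"{len(results)} runs scanned, {len(failed)} failed: {failed}")
```

A production flow would need per-tool parsers and a persistent store for the extracted metrics, rather than one-off scans of files that are normally deleted after each run.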

“Semiconductor companies can run hundreds of thousands of regressions a day, and existing methodologies are not scaling with the increasing data,” he says; that is gigabytes of log data produced every day.

“By looking into these files, we had a very specific goal: to optimize resource allocation. If you think of complex chip designs, you have thousands of engineers and millions of modules. Performing analytics on this data, we bring a methodology to know if you are on schedule. We can also compare with previous design efforts to get an insight into how the design is progressing.”


The new tool, Envision Verification Analytics (Envision-VA), uses big data algorithms to deliver customized, near real-time visual analytics and interactive reports across mixed-vendor verification environments. It relies on predictive analytics, user behaviour analytics and other advanced data analytics to extract value from the data, so verification teams can efficiently track verification progress at whole-chip level, identify bottlenecks and re-allocate their resources if need be. It is the first tool in the company’s new Big Data Labs initiative.

Through a web-based, secure interface, authorized team members get instant access to actionable, interactive reports based on the verification results data. Teams can also set regression pass/fail milestones and see the changes in progress almost immediately.

Bug tracking analytics lets engineers assign a ticket for further debug when a test fails, or update the bug status to pass, fail or needs investigation. Coverage results can then be compared with pre-set targets and plans across any number of axes, such as line, branch, FSM, code and functional coverage. RTL designers can resolve verification bottlenecks through direct links from the Envision-VA reports to their design and IP changes and activity: a regression set that changes from pass to fail can be linked to the design source files or testbench changes for root-cause analysis.
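As a rough illustration of the kind of roll-up this involves, and assuming invented field names rather than Envision-VA’s actual schema, the sketch below compares per-axis coverage against preset targets and flags tests that flipped from pass to fail, together with the files changed in between.

```python
# Illustrative roll-up: compare per-axis coverage against preset targets and
# flag regressions that flipped from pass to fail, with the files changed in
# between. All field names and structures are invented for this sketch.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class RegressionResult:
    test: str
    passed: bool
    changed_files: List[str] = field(default_factory=list)

# Hypothetical coverage targets per axis, in percent.
COVERAGE_TARGETS = {"line": 95.0, "branch": 90.0, "fsm": 85.0, "functional": 80.0}

def coverage_gaps(measured: dict) -> dict:
    """Axes where measured coverage falls short of its target."""
    return {axis: (measured.get(axis, 0.0), goal)
            for axis, goal in COVERAGE_TARGETS.items()
            if measured.get(axis, 0.0) < goal}

def new_failures(prev: List[RegressionResult],
                 curr: List[RegressionResult]) -> List[Tuple[str, List[str]]]:
    """Tests that passed in the previous run and fail now."""
    previously_passing = {r.test for r in prev if r.passed}
    return [(r.test, r.changed_files)
            for r in curr if not r.passed and r.test in previously_passing]

prev = [RegressionResult("smoke_axi", True)]
curr = [RegressionResult("smoke_axi", False, ["rtl/axi_arbiter.sv"])]
print(new_failures(prev, curr))   # -> [('smoke_axi', ['rtl/axi_arbiter.sv'])]
print(coverage_gaps({"line": 91.2, "branch": 93.5}))
```

Linking each newly failing test to the files touched since the last passing run is what enables the root-cause navigation the reports offer.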

Envision-VA’s default user interface can be customized with each team’s data fields, metrics and targets. A RESTful API lets users push verification results and log data into the tool, customize targets and integrate with business intelligence tools.
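To give a flavour of what pushing results through such an API could look like, here is a hypothetical example; the endpoint URL, payload fields and authentication scheme are invented for illustration and are not IC Manage’s documented interface.

```python
# Hypothetical illustration of pushing one regression result into an analytics
# tool over REST. The endpoint URL, payload fields and bearer token below are
# placeholders invented for this sketch; Envision-VA's actual API will differ.
import json
import urllib.request

payload = {
    "run_id": "nightly-2042",        # placeholder identifiers
    "test": "smoke_axi",
    "status": "fail",
    "log_path": "logs/smoke_axi.log",
}

req = urllib.request.Request(
    "https://envision.example.com/api/v1/results",   # placeholder URL
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": "Bearer <token>"},      # placeholder credential
    method="POST",
)

with urllib.request.urlopen(req) as resp:            # fails against the placeholder host
    print(resp.status, resp.read().decode())
```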


Talking about the Big Data Labs platform of which Envision-VA is the first product, Drako noted that, having provided such analytics to customers over a number of years, the company had realized that many other big-data-driven tools could benefit the semiconductor industry.

“What we mostly envision is that we’ll partner with semiconductor companies that have specific problems, figure out how to solve those problems, and then create tools that meet their precise needs, customizable for others to use.

“We could also partner with EDA companies. We work with all the tools from all the top EDA vendors; our solution is an overlay on top of these tools, so you can see the design progress across many different aspects of a chip,” Drako says.

Asked if this sort of data analytics could be performed horizontally across multiple chip designs from different companies, blending in machine learning to extract yet-to-be-discovered patterns of success or failure, the CEO did not see it as something likely to happen in the very secretive world of IP design.

“We are not currently looking at, say, 200 chip designs over ten years, sifting through them and trying to find interesting things, and we are not doing it between different companies either; their design methodologies are so different that it may not be fruitful anyway. We are open to trying, but there are no customer discussions geared toward that, asking us to scan all their designs across time and find revealing patterns.”

IC Manage – www.icmanage.com
