Digging EDA's big data to speed up chip verification and tape-out

December 15, 2017 // By Julien Happich
Founded in 2003 by seasoned EDA veterans, California-based IC Manage made a business out of providing an efficient design and IP management platform for companies to collaborate on single and multi-site designs, supporting and tracking the design data exchange between project managers, IP owners, chip designers, and verification teams.

Now, by sweeping through the big data generated by multi-vendor tool suites as a design is simulated, verified and debugged, the company says it can speed up verification analytics by 10 to 100x, while pinpointing verification bottlenecks tied to particular design changes. This helps companies allocate their resources in near real-time to accelerate their verification schedules.

Catching up with Dean Drako, IC Manage's president, CEO and co-founder, eeNews Europe got a glimpse of how it all got started.

"Up until 2014/2015, we were a design management company; we would manage all the data necessary to get the chips out of the door. There are literally millions and millions of files that are checked and need editing, then engineers must ensure that the right files are used in their final design," explained Drako.

"Around 2014, we started working on the big data churned out by the different EDA tools. It is very different from the data of the chip itself, which is very structured and hierarchical, with transistors and gates and IP blocks. When doing all the simulations, synthesis and verification runs, you generate a tremendous number of log files. That runtime data, normally you don't have a place to store it: you generate a gigabyte file, look at it, and if it fails you delete it and do another run. That data is very unstructured, can come from many different tools; it just looks like massive text files," Drako told us.
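The kind of log mining Drako describes can be sketched as follows. This is a minimal illustration, not IC Manage's implementation: the log format, field names and tests here are entirely hypothetical, since real EDA tool logs vary widely by vendor and are far larger.

```python
import re

# Hypothetical regression log excerpt; real verification logs are
# unstructured, vendor-specific, and can run to gigabytes per run.
log = """\
RUN test_alu_001 STATUS=PASS WALL=412s
RUN test_alu_002 STATUS=FAIL WALL=1893s ERROR=assertion at alu.sv:120
RUN test_fpu_001 STATUS=PASS WALL=755s
"""

# Extract a loose structure from the raw text: one record per regression run.
pattern = re.compile(r"RUN (\S+) STATUS=(\w+) WALL=(\d+)s")
runs = [
    {"test": m.group(1), "status": m.group(2), "wall_s": int(m.group(3))}
    for m in pattern.finditer(log)
]

# Aggregate for analytics: failure count and total compute time spent.
failures = [r for r in runs if r["status"] == "FAIL"]
total_hours = sum(r["wall_s"] for r in runs) / 3600

print(f"{len(runs)} runs, {len(failures)} failures, {total_hours:.2f} compute-hours")
```

Scaled up across hundreds of thousands of daily regressions, aggregates like these are what let a team spot which design changes are consuming disproportionate verification resources.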

"Semiconductor companies can run hundreds of thousands of regressions a day, and existing methodologies are not scaling with the increasing data," he says; that's gigabytes of data produced every day.

"By looking into these files, we had a very specific goal: to optimize resource allocation. If you think of complex chip designs, you have thousands of engineers, millions of modules. Performing analytics on this data, we bring a methodology to know if you are on schedule. We can also compare with previous design efforts to get an insight into how the design is progressing."