The state of verification 2021

Leading verification engineers talk to Nick Flaherty about the state of the industry and the rise of virtual models, AI and the cloud
By Nick Flaherty


The recent Design and Verification Conference & Exhibition Europe (DVCon Europe) saw registrations reach a record high of 465, covering 149 organisations and 24 countries.

The show, organised with the Accellera standards group for system-level design, modeling, and verification standards, featured a full Virtual Reality (VR) 3D world, modelled on a conference centre in Munich, with 24 papers, 13 tutorials and two panels. 

“This year we really changed the way in which people could talk to each other at the ‘virtual conference’ and it was very gratifying to hear from fellow participants that they had never experienced such a rich virtual environment,” said Mark Burton, Virtual Platform Chair and founder and CEO of French SystemC and virtual platform expert GreenSoC.

Burton joined with vice chair Joachim Geishauser from NXP Semiconductors and technical programme chair Alexander Rath from Infineon to assess the trends in the verification industry for eeNews Europe. This follows announcements from ARM ahead of the conference on new ways to access virtual prototypes for software development and continuous integration and continuous delivery (CI/CD) methodologies.


“We had a lot of papers about verification of bigger SoCs where the scalability of the test bench and the configurability was an issue, so it’s about how to architect a verification environment,” said Rath. “That’s something perhaps even the UVM standardisation committee needs to look at.”

“We are pretty good at verifying blocks and mid-sized chips, but Europe does big mixed-signal chips for automotive where you have many challenges apart from the SoCs just being big. With analog, digital and software, it’s an enormous systemic complexity that is hard to quantify,” he said. “Then on top of that there are the challenges of functional safety: how to make the verification systemically consistent so that a car will not do any harm.

“In the automotive world especially there are rising security requirements with new standards. That’s kind of funny, as security and safety have an overlap, but in other cases they are entirely separate – you can’t just shut everything off if you get a security alert when you are travelling at 150km/h on the highway. One way is to have a very rigid, thorough engineering approach. Another approach is virtual prototyping,” he said.

“I was particularly struck by the focus on virtual prototypes, which up until now have been seen as a way to get software engineers up and running quicker for continuous integration (CI) and test, which is booming right now, but are also being used to address safety and security issues,” said Mark Burton at GreenSoC.

“I feel like it’s taken the next step and we are seeing a lot in automotive. There are a lot of silos in the industry. Aerospace and defence have been doing safety and security for donkey’s years and they have their own methodologies and languages and their own virtual platforms and their own standards. Airbus, for example, would love to use commercial off-the-shelf (COTS) devices and is asking for models to particular standards, and this is a live action for Accellera to address.”

“Integrating security depends on the virtual prototype,” he said. “There are hybrid models with FPGAs and SystemC models, and there’s a lot of flexibility in a software virtual prototype to open up a design in any way. But the way it responds is not necessarily the way the real hardware will respond. Companies like Airbus have been doing this for decades, and there is a lot of knowledge there and a lot of communication with automotive designers.”

He sees the role of virtual prototyping changing. “In the past it’s been an additional check rather than replacing a check,” he said. “It means I can develop the verification quicker on the virtual prototype for a minimum set of tests to prove the original functional definition.”

“If there is a safety or security alert, virtual prototypes are good to make sure the system as a whole still runs correctly,” said Rath. “What you can’t really do on a virtual prototype is intrusion testing, detecting hacks or safety problems; this needs to be verified on a level closer to the implementation.”

“The virtual platforms can help you get to the right set of tests, and as an additional test in the virtual environment,” said Burton.

“There are two issues,” said Geishauser. “Security built into software can be verified with a virtual prototype, but there is also security built into the hardware, and this needs assessment with formal methods.”

“The other factor that is also driving the designs is the increasing size of devices,” he said. “The tools are not able to handle the size, so you need to partition the complete SoC, and that forces the verification to align with the partitions and then use a modular, hierarchical verification.”

Verification in the cloud


Moving designs to the cloud is a way to address this. “That is key to the move to cloud-based design tools, although there is reluctance,” said Geishauser.

“I’m not convinced the movement to the cloud is about simulating a device across millions of nodes,” said Burton. “It’s more about the CI testing every time they commit code to the repository, and letting Amazon deal with that – lots of machines running tests in a continual fashion. It’s about running lots of tests without having to worry about scaling.”
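The pattern Burton describes – fanning a whole regression out across many machines on every commit, rather than running one huge simulation – can be sketched in plain Python. This is an illustrative sketch only: `run_test` is a hypothetical stand-in for launching one simulation job, and in a real cloud CI setup each call would be dispatched to its own machine or container rather than a local thread.

```python
from concurrent.futures import ThreadPoolExecutor

def run_test(name):
    # Hypothetical stand-in for one simulation job; a real CI system
    # would launch this on its own cloud machine or container.
    passed = True  # result of the (simulated) test run
    return name, passed

# Every commit triggers the whole regression, fanned out in parallel.
tests = [f"smoke_{i}" for i in range(8)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(run_test, tests))

failed = [name for name, ok in results.items() if not ok]
print(f"{len(tests) - len(failed)}/{len(tests)} tests passed")
```

The scaling point is that the test list can grow arbitrarily without any one job getting bigger; the pool (or the cloud) simply absorbs the extra work.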

“The other part of the cloud that’s interesting is that there are now FPGAs in the cloud, and people are very interested in this. These boards are expensive, so you can try it out in the cloud and then bring it back in-house. I see the cloud as an experimentation zone.”

“The security is an issue,” said Geishauser. “You don’t want to push any data out that contains security information, so you need a special cloud for that, and there are a lot of legal issues around that.”

“We have had the same discussions at Infineon, with security, cloud is problematic,” said Rath.

Machine learning for verification

“ML and AI have been out there for many years but haven’t been taken up heavily, and I don’t see it in verification yet. There are some small areas, but there are still applications that need to be found,” said Rath. “I definitely see potential in ML but have seen that for quite some time now.”

“It’s still always going to happen tomorrow. There are some benefits. Verification engineers are spending a lot of time covering the last few percent of coverage with directed tests. I think some AI could help there, based on the previous regressions, to tweak the directed stimulus tests so the number of tests comes down,” he said.

“There are some EDA solutions targeting this; whether it’s really productively useful we still need to see, but this could be one area that could really bring down the time and engineering cost. On top of that, writing directed tests has to happen shortly before tapeout, so that gives a schedule problem.”

“It is definitely an upcoming area but I don’t think we are there yet,” added Burton.

“There are more fundamental problems we need to face, which is getting hardware and software verification aligned in doing the verification, as well as the processes and interlinking the processes. Getting all those things together in a complete functional system solution is still one of the bigger miracles,” said Geishauser.

“What AI could do today is to tweak the randomisation of the tests based on the regression results but it doesn’t scale to a level where it closes the skills gap,” said Rath.
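The idea Rath describes – using previous regression results to re-weight the randomised stimulus toward poorly covered areas – can be sketched in a few lines of Python. This is a hypothetical minimal scheme, not a description of any particular EDA tool: the bin names and the weighting rule are invented for illustration.

```python
import random

def adjust_weights(weights, hit_counts):
    """Bias stimulus selection toward coverage bins that previous
    regressions hit rarely (hypothetical re-weighting rule)."""
    new = {}
    for bin_name, w in weights.items():
        hits = hit_counts.get(bin_name, 0)
        # Rarely hit bins get proportionally more weight next run.
        new[bin_name] = w / (1 + hits)
    total = sum(new.values())
    return {b: w / total for b, w in new.items()}

def draw_stimulus(weights, rng=random):
    # Pick the next randomised transaction type under the new weights.
    bins = list(weights)
    return rng.choices(bins, weights=[weights[b] for b in bins], k=1)[0]

# Example: "burst" transactions were rarely covered in the last regression.
weights = {"single": 1.0, "burst": 1.0, "error": 1.0}
hits = {"single": 90, "burst": 2, "error": 8}
weights = adjust_weights(weights, hits)
```

After the adjustment, the under-covered "burst" bin dominates the next run’s stimulus mix, which is exactly the manual tuning verification engineers spend time on today.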

System level AI design


There is also a disconnect between the AI work done at the system level and the requirements for verification.

“We need an EDA solution that a verification engineer can operate. We will probably not have verification engineers working with TensorFlow or building models,” said Rath at Infineon.

“The chip makers are bringing the AI in-house, and that plays to the cloud story, as there is a degree to which those algorithms can be verified in the cloud, but those need AI acceleration in the cloud,” said Burton.

“EDA tools face a chicken-and-egg situation, as there are a lot of people thinking at the RTL level and only if they are forced do they move to the system level,” said Geishauser at NXP. “The EDA companies also have their own silos for RTL and system-level customers that hardly talk to each other, and until we get to the point of real top-down design that can guarantee the behavioural models and RTL are behaving in the same way, it will not change that fast.”

This is driving the use of FPGAs for emulation for verification, he said. “There are still a lot of RTL designers, and FPGA emulation is taking off as a result, as it fits the RTL thinking.”

Burton agrees.

“People want to use FPGAs as they are accurate, but they are really slow, a factor of 100 slower compared to a CPU executing a virtual model in the QEMU open source virtualiser, which is what we use – it’s very efficient and very fast.”

“But at the end of the day you have to live with the fact that it’s a particular tool; it’s a spanner, not a hammer, a tool that will help with certain things, and there’s no point in trying to pretend it’s an exact representation of the hardware,” he said. “There is definitely a step to CI and test for virtual prototypes, but there are so many companies that bolt virtual platforms on afterwards for software engineers to do high-level verification and test, and we still haven’t included this in the flow.”

The use of the Python programming language, highlighted last year, continues to grow as a replacement for other languages such as Tcl.

“Python for verification is also a discussion that frequently arises,” said Rath. “Now instead of Tcl there’s a Python interface, so it will be interesting to see. A lot of domain-specific languages could be replaced by Python and libraries – do we really need SystemVerilog and UVM? Couldn’t we just do that in Python?” he asked.
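What such a Python-only flow might look like can be sketched without any SystemVerilog at all: constrained-random stimulus, a golden reference model and a toy functional coverage collector in plain Python. This is an illustrative sketch, not UVM; the ALU functions and coverage bins are invented for the example, and open-source projects such as cocotb take Python testbenches in this direction against real RTL simulators.

```python
import random

class CoverageModel:
    """Toy functional coverage: count how often each operation bin is hit."""
    def __init__(self, bins):
        self.hits = {b: 0 for b in bins}
    def sample(self, value):
        if value in self.hits:
            self.hits[value] += 1
    def coverage(self):
        return sum(1 for n in self.hits.values() if n > 0) / len(self.hits)

def ref_alu(op, a, b):
    # Golden reference model: 8-bit wraparound arithmetic.
    return {"add": (a + b) & 0xFF, "sub": (a - b) & 0xFF, "and": a & b}[op]

def dut_alu(op, a, b):
    # Stand-in for the design under test; in practice this would drive
    # signals on an RTL simulation rather than call a Python function.
    return ref_alu(op, a, b)

cov = CoverageModel(["add", "sub", "and"])
rng = random.Random(2021)
for _ in range(50):
    op = rng.choice(["add", "sub", "and"])         # constrained-random stimulus
    a, b = rng.randrange(256), rng.randrange(256)
    assert dut_alu(op, a, b) == ref_alu(op, a, b)  # scoreboard check
    cov.sample(op)
print(f"functional coverage: {cov.coverage():.0%}")
```

The pieces map one-to-one onto the UVM concepts – sequence, scoreboard, covergroup – which is what makes the "couldn’t we just do that in Python?" question plausible.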

There are moves to boost the use of Python.

“We developed PySysC at SystemC.org, which was donated to Accellera earlier this year, and you can have that execute SystemC code or configure SystemC code for the best of both worlds,” said Burton. “That will flow into verification, and there were discussions on the way in which PySysC could bring those two things together for verification.”

DVCon 2022

The next DVCon Europe will be held on 6th and 7th December 2022, with SystemC Evolution Day on 8th, at the Holiday Inn Munich City Centre.

“We are proud to have achieved a record high number of registrations and, since networking is an important part of DVCon Europe, we are very happy that we’ve been able to continue to offer those opportunities over the last two years,” said Sumit Jha, General Chair of the DVCon Europe steering committee for 2021 and Senior Staff Engineering Manager at Qualcomm.

The award for Best Paper at DVCon 2021 went to Ana Sanz Carretero, Katherine Garden and Wei Wei Cheong from Xilinx for the paper entitled “Testbench Flexibility as a Foundation for Success”, while the Best Poster was won by Caglayan Yalcin and Aileen McCabe of Qualcomm for “An Analysis of Stimulus Techniques for Efficient Functional Coverage Closure”.
