Tackling the Wild West of verification

By Nick Flaherty

Verification has emerged from the Covid pandemic as a Wild West of challenges. Nick Flaherty talks to Martin Barnasconi of NXP and Mark Burton of Qualcomm and consultancy GreenSocs as they look to the tenth anniversary of the DVCon Europe conference and discuss trends for the verification industry.

A world moving to digital twins is looking to combine high level models with silicon verification to speed up chip development, particularly for automotive systems. This is bringing significant challenges as European car makers set up their own chip design teams and huge software teams.

“The biggest thing was that we were able to get together, and the thing that struck me was there were more people than expected, 400 people, which is the same level as pre-Covid,” said Barnasconi, chair of DVCon Europe.

“What I saw is that everyone was trying new things. There are extensions to existing tools, connecting to emulators, to the Python world, to the software world, and there is divergence in how people are implementing these extensions. Just using the UVM test bench will not be enough,” he said.

“So there is a bit of a Wild West for the exploration phase, and sooner or later this will consolidate into standardising certain extensions. The industry has accepted there is value in UVM, but people know where it starts, where it stops and what they need to do.”
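The Python-connected testbench extensions Barnasconi alludes to can be illustrated with a toy constrained-random stimulus generator. This is a minimal sketch, not any specific tool's API; all names here are hypothetical:

```python
import random
from collections import Counter

class PacketStimulus:
    """Toy constrained-random stimulus generator, loosely in the
    spirit of a UVM sequence driven from Python (names hypothetical)."""

    def __init__(self, seed=0):
        self.rng = random.Random(seed)   # seeded for reproducible runs
        self.coverage = Counter()        # toy functional-coverage bins

    def next_packet(self):
        # Constraints: opcode from a fixed set, length between 1 and 64
        pkt = {
            "opcode": self.rng.choice(["READ", "WRITE", "NOP"]),
            "length": self.rng.randint(1, 64),
        }
        self.coverage[pkt["opcode"]] += 1
        return pkt

gen = PacketStimulus(seed=42)
packets = [gen.next_packet() for _ in range(100)]
```

In a real flow the packets would be driven into a UVM testbench through a co-simulation bridge; the appeal of the Python side is reproducible randomisation and easy coverage reporting.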

“You also see more and more formal methods popping up, becoming more mainstream,” he said.

Some things haven’t changed, particularly the relationship of verification engineers with the large tier one automotive suppliers, who are looking at designing their own chips and are also using high-level models.

“This time we had Mercedes still asking for the same kind of things that Audi asked for three years ago, but the exact direction is different,” said Mark Burton of Qualcomm. “We need to have models first but the companies are in transition to the digital twin and system modelling approach so the OEMs are knocking on the doors of the semiconductor makers to make that happen.”

“We know that the value chain in automotive has its strengths and weaknesses. It is very rigid and fixed but connecting all the companies together is a challenge on its own so they are reaching out to the verification community. That’s a very interesting thing that we have seen before,” he said.

“There is change in the air. The automotive industry will drive things for the next few years,” said Burton. “Functional verification to the specification is well covered, but there are moves to validation and understanding the use cases and whether the chips comply with those.”

If the car, or even the city, is to be virtualised, how does verification come into the digital twin?

“We use vanilla stimuli in plain language, and in that sense the current verification environments are very basic. How can we connect the software teams and system teams to the verification folks to work together?” said Burton.

One pillar is virtual prototyping, alongside the more traditional verification approaches.

“In automotive we are a world away from providing the verification that is needed. It is technically hard. The thing that is causing me the most obvious pain is integrating systems of systems, integrating cabin electronics, ADAS, braking, and then you pump a load of software engineers into the OEMs. There are some particular technological points that will explode and one of those is hypervisors,” said Burton.

“The challenge is connecting the system world with the verification world and getting the software, system and hardware people in the same room. That is a challenge, as they are not at the same level of abstraction on how to make a model or use a model, so it’s a whole journey that will take years. The hardware/software co-development problems have not been solved, and there’s also the mixed-signal fabric, sensors and mechanical issues. We are barely able to run a fast digital simulation, let alone mixed-signal or analogue, so we definitely have challenges in the modelling space,” said Barnasconi.

“That’s where Accellera is important, so if we need a language or standard, this is a hidden topic. Clearly the industries are miles apart; how do we bring them together?” he said.

One of the other key trends is the increasing processing power of the cloud, but this also comes with challenges.

“We have things in the SystemC standard, IEEE 1666, and that will include parallel simulation,” said Barnasconi. Burton points out the security and legal issues of collaboration in the cloud, as people still want their own secure zones in the cloud. This, coupled with the complexity of system verification, is driving a move to distributed simulation, or federated simulation, with multiple environments working together for systems of systems, including software stacks in all kinds of forms.
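Federated simulation of the kind described here couples independent simulators through a coordinator that synchronises time and exchanges data each step. Below is a minimal lockstep sketch in plain Python, with all names hypothetical; real deployments build on standards such as FMI or HLA:

```python
class Federate:
    """Minimal co-simulation participant: advances local time one step
    at a time and exchanges one value per step (illustrative only)."""

    def __init__(self, name, step, state=0.0):
        self.name = name
        self.time = 0.0
        self.step = step
        self.state = state

    def advance(self, incoming):
        # Toy dynamics standing in for a real simulator kernel
        self.state = 0.5 * self.state + incoming
        self.time += self.step
        return self.state

def co_simulate(fed_a, fed_b, end_time):
    """Lockstep coordinator: exchange the previous outputs, then
    advance both federates, keeping their clocks synchronised."""
    out_a, out_b = 0.0, 0.0
    while fed_a.time < end_time:
        # Right-hand side uses last step's outputs for both federates
        out_a, out_b = fed_a.advance(out_b), fed_b.advance(out_a)
    return out_a, out_b

ecu = Federate("ecu_model", step=0.1, state=1.0)
plant = Federate("plant_model", step=0.1)
out_ecu, out_plant = co_simulate(ecu, plant, end_time=1.0)
```

The lockstep scheme is the simplest coordination policy; production frameworks add rollback or conservative time windows so that federates with different step sizes can run ahead safely.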

Links to research are also important.

“We are also looking at ways to get a better connection to the research world: quantum, AI, machine learning. It would be good to establish stronger connections with the research happening around us, and with what the industry expects in terms of skills, so the next generation of students are more hands-on.”

DVCon Europe returns to Munich on November 14 and 15, 2023.
