“With the standardisation we made a big step forward. What we see in general is designs come out with fewer bugs,” said Alex Rath, Director Concept, Digital Design and Verification at Infineon and Technical Program Chair of DVCon Europe. “The more controversial part is if you look at the amount of computational and human resources that the industry is still throwing at the problem, this is still tremendous, and that’s a cost.”
The standard is, of course, being extended to address the needs of verification engineers, and this is leading to more fragmentation.
“What we have seen is a strong UVM baseline, but we saw papers working as an extension on top of UVM, using different languages, and with this exploration we start to diverge again as people are going to all kinds of non-standard things,” said Barnasconi. “That’s a very interesting trend and good to explore the missing elements in the verification framework, such as mixed signal, software, and testing for functional safety and security. Teams of engineers have been trying to find ways to extend UVM or formal methods to master the new requirements in the last couple of years. They need more than is currently available in the standard or the tools, so it’s going to be more fragmented. Five or ten years from now I would expect a consolidation of these ideas into the next big revision of a standard.”
The need for software verification on top of hardware is an increasingly key issue, and this has been driving the use of digital twins. These are software simulation models, sometimes up to the level of a complete system, that can be used for validation rather than verification.
“That is another angle where there is friction between the UVM hardware-oriented methodology and how software verification is done with unit tests, so we need to move from a hardware-centric approach to co-verification,” said Barnasconi. “We have been talking about this for a decade. Is there a common way of working? Not really.”
“The digital twin is more and more emerging, but there is an unanswered question,” said Rath. “Is it a verification vehicle or a system validation or virtual prototyping vehicle? This question is not answered for me yet. To build a working digital twin you need a lot of high-level abstraction, and this perhaps eclipses the bugs that you wanted to find.”
System-level verification is a key area for the Accellera consortium with its SystemC language.
“The process of the digital twin is to combine the different stages of the lifecycle, but equally important is the digital thread, the connection between all the steps, connecting to the verification environment,” said Barnasconi. “The industry is not there yet with verification at the system level. If you want to do a real top-down design flow with a digital twin, having the twin is one thing, but you need to verify it works in the environment, so my verification environment is orders of magnitude more complex.
“It will be a real challenge to move existing verification models to the system level and I think there we have huge challenges,” he said.
Verification is currently still very bounded, with clear, fixed use cases and clear standards to verify against. The move to digital twins, especially in areas such as autonomous vehicles, is a real challenge.
“For autonomous vehicles, there is no standard or structure, so how do you deal with uncertainty and chaos?” he said. “How do I translate requirements into use cases? Although we have a good track record in developing languages, are we ready for these challenges? Addressing real-world scenarios, whether it’s a chip or a car, we still have a long way to go.”
“Is this a verification vehicle or is it something else?” said Rath. “With more abstraction you are validating, not verifying, the software or the hardware. So that’s an interesting question, where we will end up.”