November 13, 2020 // By Nick Flaherty
Future trends for verification
Martin Barnasconi from NXP and Alex Rath from Infineon Technologies talk to Nick Flaherty about the challenges for the verification industry revealed at DVCon Europe, from UVM and Python to digital twins and instrumentation.

Instrumentation

A different approach that was highlighted at the conference was the use of instrumentation. Adding hardware to complex chips to monitor performance during development and testing can significantly reduce validation time. It can also be used throughout the lifecycle of a system for predictive maintenance. This has been the driver for the recent acquisitions of UltraSoC by Mentor and, yesterday, of Moortec by Synopsys.

“We need to better leverage different technologies,” said Barnasconi. “Yes, we can virtualise and simulate a lot, but you need evidence that it works in real life, and that connects to the FPGA prototyping world with a seamless flow. There, standards are key for carrying a simulation environment over into the hardware world.”

“There are already early standards for running UVM test benches on an emulator, and while this has been done for decades, the APIs are not usually public,” said Rath.

“Here you see that test and calibration routines for hardware are now more software, and that’s an interesting trend. If you want to add hardware for instrumentation, there’s a cost to that. Instrumentation can be used as a verification technology; the question is whether you want to, as you want to do as much pre-silicon as possible,” he said.
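To make the “test routines as software” idea concrete, here is a minimal C sketch of the kind of calibration routine an on-chip processor might run in place of fixed test hardware. Everything in it is hypothetical, as the article describes no specific implementation: the register addresses, mux codes and ideal ADC codes are invented stand-ins for whatever a real datasheet would define.

```c
/* Hypothetical sketch: two-point ADC calibration run as software by an
 * on-chip processor. All addresses and values are invented for illustration. */
#include <stdint.h>

#define ADC_DATA (*(volatile uint32_t *)0x40001000u) /* hypothetical result register */
#define ADC_MUX  (*(volatile uint32_t *)0x40001004u) /* hypothetical input mux       */
#define MUX_VREF_LO 0u /* select internal low reference  */
#define MUX_VREF_HI 1u /* select internal high reference */

typedef struct {
    int32_t offset;   /* code error at the low reference    */
    int32_t gain_q16; /* gain correction in Q16 fixed point */
} adc_cal_t;

static uint32_t adc_read_avg(uint32_t mux, int samples)
{
    uint64_t acc = 0;
    ADC_MUX = mux;
    for (int i = 0; i < samples; i++)
        acc += ADC_DATA; /* assumes each read returns a fresh sample */
    return (uint32_t)(acc / (uint32_t)samples);
}

adc_cal_t adc_calibrate(void)
{
    const int32_t code_lo_ideal = 409;  /* device-specific ideal codes, */
    const int32_t code_hi_ideal = 3686; /* here purely illustrative     */
    adc_cal_t cal;

    int32_t lo = (int32_t)adc_read_avg(MUX_VREF_LO, 64);
    int32_t hi = (int32_t)adc_read_avg(MUX_VREF_HI, 64);

    cal.offset   = lo - code_lo_ideal;
    /* assumes hi > lo; production code would sanity-check the readings */
    cal.gain_q16 = (int32_t)((((int64_t)(code_hi_ideal - code_lo_ideal)) << 16)
                             / (hi - lo));
    return cal;
}
```

Because such a routine is just code, the same binary can in principle run on the tester, in pre-silicon simulation of the processor, or later in the field.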

“Then it’s a question of how clever the concept engineers are. For example, can I reuse the instrumentation in mission mode? That’s probably a trend that’s coming more and more: there are on-chip ADCs tested by an on-chip processor, and this will be done more and more, especially considering that test time is expensive.”
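The mission-mode reuse Rath describes might then look like the following sketch, which reuses the hypothetical definitions above: the same ADC read-out path is re-measured in the field, and drift beyond a pass window is flagged, for example as an input to predictive maintenance. The ideal code and threshold are invented for illustration, not taken from the article.

```c
/* Hypothetical sketch, reusing adc_cal_t, adc_read_avg and MUX_VREF_HI from
 * the calibration example above: the test-time instrumentation is re-run
 * periodically in mission mode to detect drift. */
#include <stdbool.h>

bool adc_self_check(const adc_cal_t *cal)
{
    /* Re-measure the high reference and apply the stored calibration. */
    int32_t raw  = (int32_t)adc_read_avg(MUX_VREF_HI, 16);
    int32_t code = (int32_t)((((int64_t)(raw - cal->offset)) * cal->gain_q16) >> 16);

    /* Invented pass window: roughly +/-1 % around an illustrative ideal code. */
    return (code > 3686 - 37) && (code < 3686 + 37);
}
```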

Revolutionary shift

This higher-level verification and validation will have a dramatic impact on the design cycle.

“With functional safety there are specific requirements on how the chip architecture is defined, with clear definitions of who is responsible for each function, for test and for safety, so we need to do more thinking up front, and that thinking needs to cover verification and validation,” said Barnasconi. “As an industry we create silicon, which is great, but is the architecture optimised for these kinds of things?

“There has already been a lot of thinking about how functional safety is developed and architected. The challenge is that with analogue moving to digital, and digital into software, we need to bring that into the equation of functional safety. Is it still robust enough, did we look at all the expected and unexpected cases, and how do we translate those cases into test vectors?”

“Sometimes you need to go back to the drawing board and ask if it is suitable for verification and validation. The evolution of the product is not enough; it needs a more revolutionary design style,” he said.

Next: Multiple languages for verification

Picture: Verification engineers Martin Barnasconi from NXP and Alex Rath from Infineon Technologies
