The verification industry faces key challenges as systems become increasingly complex. Following a successful DVCon Europe conference this month, two leading verification engineers examine the trends shaping the industry, from the emergence of Python as a language for design and verification to the use of instrumentation and digital twins for the validation of complex systems.
“As the semiconductor industry moves to more complex devices, we have better mastered the design and verification cycle,” said Martin Barnasconi, Technical Director of System Design and Verification Methodologies at NXP. He is also the Technical Committee Chair at Accellera and global co-ordinator of the DVCon conferences worldwide.
“Yes, there is a challenge with the physical phenomena, as FinFETs are not as ideal as previous transistors, so more analogue effects are popping up in verification. But on a bigger scale we have mastered that, and the community is continuously learning.”
To a certain extent the verification industry has matured with the latest version of the UVM standard, he says, but there are key lessons going forward. Some of these were highlighted in the keynote session at the conference.
“We had a fractured landscape in the 1990s with multiple languages popping up, but the industry has learned and consolidated all that knowledge into a true verification methodology. Over the last decade the industry has matured into UVM – that’s a great step forward,” he said.

“However, a single language and tool don’t solve the verification problems: you need to change the workflows and libraries, and UVM is an IEEE standard for functional verification at the block level rather than the system level. The first IEEE release of UVM was in 2017 and the 2020 version came out a month ago, so improvements are still being made,” he said.