The verification industry is facing key challenges as systems become increasingly complex. After a successful DVCon Europe conference this month, two leading verification engineers look at the trends in the industry, from the emergence of Python as a language for design and verification to the use of instrumentation and digital twins for the validation of complex systems.
“As the semiconductor industry moves into more complex devices, we have better mastered the design and verification cycle,” said Martin Barnasconi, Technical Director of System Design and Verification Methodologies at NXP. He is the Technical Committee Chair at Accellera and global co-ordinator of all the DVCon shows internationally.
“Yes, there is a challenge on the physical phenomena, as FinFETs are not as ideal as previous transistors, so there are more analogue effects popping up for verification. But on a bigger scale we have mastered that, and the community is continuously learning.”
To a certain extent the verification industry has matured with the latest version of the UVM standard, he says, but there are key lessons going forwards. Some of these were highlighted in the keynote session at the conference.
“We had a fractured landscape in the 1990s with multiple languages popping up, but the industry has learned and consolidated all that knowledge into a true verification methodology, so the industry has matured over the last decade into UVM – that’s a great step forward,” he said. “However, a single language and tool don’t solve the verification problems; you need to change the workflows and libraries, and UVM is an IEEE standard for functional verification at the block level rather than the system level. The first IEEE release of UVM was in 2017 and the 2020 version came out a month ago, so there are still improvements being made,” he said.
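UVM itself is a SystemVerilog class library, but the component roles it standardises can be sketched in a language-agnostic way. The following plain-Python example is a hypothetical illustration of that structure – a driver applying stimulus, a monitor observing the interface, and a scoreboard checking results – not actual UVM code:

```python
# Hypothetical sketch of the component roles UVM standardises in
# SystemVerilog: driver applies stimulus, monitor observes, scoreboard checks.
class Driver:
    def __init__(self, dut):
        self.dut = dut

    def drive(self, value):
        self.dut["in"] = value
        self.dut["out"] = value + 1  # stand-in for the toy design's behaviour


class Monitor:
    def __init__(self, dut):
        self.dut = dut

    def observe(self):
        return self.dut["out"]


class Scoreboard:
    def __init__(self):
        self.errors = 0

    def check(self, stimulus, observed):
        # The expected behaviour of this toy design is out = in + 1
        if observed != stimulus + 1:
            self.errors += 1


# Wire the components around a trivial "design under test"
dut = {"in": 0, "out": 0}
driver, monitor, scoreboard = Driver(dut), Monitor(dut), Scoreboard()
for value in range(10):
    driver.drive(value)
    scoreboard.check(value, monitor.observe())
```

The value of the methodology is less any individual class than the agreed separation of roles, which is what lets testbench components be reused across projects.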
Industry fragmentation
“With the standardisation we made a big step forward. What we see in general is that designs come out with fewer bugs,” said Alex Rath, Director Concept, Digital Design and Verification at Infineon and Technical Program Chair of DVCon Europe. “The more controversial part is the amount of computational and human resource that the industry is still throwing at the problem – it is still tremendous, and that’s a cost.”
The standard is of course being extended to address the needs of verification engineers, and this is leading to more fragmentation.
“What we have seen is a strong UVM baseline, but we saw papers working as an extension on top of UVM, using different languages, and with this exploration we start to diverge again as people are turning to all kinds of non-standard things,” said Barnasconi. “That’s a very interesting trend, and it is good to explore the missing elements in the verification framework, such as mixed signal, software, and testing for functional safety and security. Teams of engineers have been trying to find ways to extend UVM or formal methods to master the new requirements in the last couple of years. They need more than is currently available in the standard or the tools, so it’s going to be more fragmented. Five or ten years from now I would expect a consolidation of these ideas into the next big revision of a standard.”
The need for software verification on top of hardware is an increasingly key issue, and this has been driving the use of digital twins. These are software simulation models, sometimes up to the level of a complete system, that can be used for validation rather than verification.
“That is another angle where there is friction between the UVM hardware oriented methodology and how software verification is done with unit test so we need to move from a hardware centric approach to co-verification,” said Barnasconi. “We have been talking about this for a decade. Is there a common way of working? Not really.”
“The digital twin is more and more emerging, but there is an unanswered question,” said Rath. “Is it a verification vehicle or a system validation or virtual prototyping vehicle? This question is not answered for me yet. To build a working digital twin you need a lot of high-level abstraction, and this perhaps eclipses the bugs that you wanted to find.”
System-level verification is a key area for the Accellera consortium with the SystemC language.
“The process of the digital twin is to combine the different stages of the lifecycle, but equally important is the digital thread, the connection between all the steps, connecting to the verification environment,” said Barnasconi. “The industry is not there yet with verification at the system level. If you want to do a real top-down design flow with a digital twin, having the twin is one thing, but you need to verify that it works in the environment, so my verification environment is orders of magnitude more complex.
“It will be a real challenge to move existing verification models to the system level and I think there we have huge challenges,” he said.
Verification is currently still very bounded, with clear, fixed use cases and clear standards to verify against. The move to digital twins, especially in areas such as autonomous vehicles, is a real challenge.
“For autonomous vehicles, there is no standard or structure, so how do you deal with uncertainty and chaos?” he said. “How do I translate requirements into use cases? Although we have a good track record in developing languages, are we ready for these challenges? In addressing real-world scenarios, whether it’s a chip or a car, we still have a long way to go.”
“Is this a verification vehicle or is it something else?” said Rath. “With more abstraction you are validating, not verifying, the software or the hardware. So that’s an interesting question – where will we end up?”
Instrumentation for verification
A different approach that was highlighted at the conference was the use of instrumentation. Adding hardware to complex chips to monitor performance during development and testing can significantly reduce validation time. It can also be used throughout the lifecycle of a system for predictive maintenance. This has been the driver for the recent acquisitions of UltraSoC by Mentor and, yesterday, of Moortec by Synopsys.
“We need to better leverage different technologies,” said Barnasconi. “Yes, we can virtualise and simulate a lot, but you need evidence that it works in real life and connects to the FPGA prototyping world with a seamless flow. There, standards are key for carrying a simulation environment into the hardware world.”
“There are already early standards with UVM test benches to use on an emulator and while this has been done for decades, the APIs are not usually public,” said Rath.
“Here you see that the concepts of testing and calibration hardware routines are more software now, and that’s an interesting trend. If you want to add hardware for instrumentation, there’s a cost to that. Instrumentation can be used as a verification technology; the question is whether you want to – you want to do as much pre-silicon as possible,” he said.
“Then it’s a question of how clever the concept engineers are. For example, can I reuse the instrumentation in mission mode? That’s probably a trend that’s coming more and more; there are on-chip ADCs tested by an on-chip processor, and this will be done more and more, especially considering that test time is expensive.”
This higher level verification and validation will have a dramatic impact on the design cycle.
“With functional safety there are specific requirements on how the chip architecture is defined with clear definitions of who is responsible for each function, for test, for safety, so we need to do more thinking up front and that needs to cope with verification and validation,” said Barnasconi. “As an industry we create silicon which is great but is the architecture optimised for these kinds of things?
“There has already been a lot of thinking on functional safety and how that is developed and architected; the challenge is that with analogue moving to digital, and digital into software, we need to bring that into the equation of functional safety. Is it still robust enough? Did we look at all the expected and unexpected cases? How do we translate any of these cases into test vectors?”
“Sometimes you need to go back to the drawing board and ask if it is suitable for verification and validation. The evolution of the product is not enough, it needs a more revolutionary design style,” he said.
Multiple languages for verification
Encouraging just one language for verification is also not a viable way forward for the combination of hardware, software and system verification, despite the advantages a single language would bring to engineers and tool developers.
“What we have also seen is the need for multi-language frameworks,” said Barnasconi. “Many teams are using SystemVerilog, VHDL and SystemC, which are driven from the hardware world, so we are dealing with a multi-language environment; the software world is using other languages, so we have an interesting challenge.”
“I don’t think we need to consolidate into a single language, but we need to come up with a way those approaches can talk to each other, so it is all about interfaces and communication, and there is an Accellera working group on that. We need this in the verification space. We have languages that are better at the system level, and these need to be integrated and have clear definitions of how the interfaces should be done. The good news is that many standards have defined transaction levels – and back to the digital twin, these are defined – so we are zooming in on how different entities talk to each other.”
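The transaction-level interfaces Barnasconi describes can be sketched in plain Python. This is a hypothetical illustration, not an Accellera standard: two components agree only on a transaction type and a put/get channel interface, so either side could in principle be backed by a SystemC, SystemVerilog or Python model.

```python
import queue
from dataclasses import dataclass


# A hypothetical bus transaction; real transaction-level payloads (e.g. the
# SystemC TLM-2.0 generic payload) similarly carry address and data fields.
@dataclass
class BusTransaction:
    address: int
    data: int


class TlmChannel:
    """A minimal blocking transaction-level channel between two components."""

    def __init__(self):
        self._fifo = queue.Queue()

    def put(self, txn: BusTransaction) -> None:
        self._fifo.put(txn)

    def get(self) -> BusTransaction:
        return self._fifo.get()


# Producer side: push transactions rather than wiggling pin-level signals
channel = TlmChannel()
for addr in range(4):
    channel.put(BusTransaction(address=addr, data=addr * 2))

# Consumer side: a monitor or scoreboard drains the same channel
received = [channel.get() for _ in range(4)]
```

The point of the sketch is the design choice: once both sides speak in transactions rather than signals, the language each component is written in stops mattering at the interface.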
“I see users using Python, and vendor support, so we are entering a new era in that sense – it is a language that is recognised and we need to incorporate it. It is a valuable asset and we need to find the right mechanism, not just a binding layer; you need to work out how Python components can work with SystemC components,” he said.
“Python is probably THE language in academia, so all the freshers speak Python,” said Rath. “Who speaks VHDL any more coming out of university? Everything coming from AI is typically built in Python, so for those people joining the semiconductor industry or the EDA industry it’s natural to build stuff in Python, so we will see that more as an interface to design tools.”
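Frameworks such as cocotb already let engineers drive RTL simulators from Python. As a standalone illustration of the style (the adder model and function names here are hypothetical, with a pure-Python stand-in for the simulated design), a Python testbench typically randomises stimulus and checks the design against a golden reference model:

```python
import random


def dut_adder(a: int, b: int, width: int = 8) -> int:
    """Stand-in for the design under test: an 8-bit wrapping adder."""
    return (a + b) % (1 << width)


def reference_model(a: int, b: int, width: int = 8) -> int:
    """Golden model the testbench compares the design against."""
    return (a + b) % (1 << width)


def run_random_test(num_vectors: int = 1000, seed: int = 42) -> int:
    """Drive seeded random stimulus and count DUT/reference mismatches."""
    rng = random.Random(seed)  # fixed seed keeps failures reproducible
    mismatches = 0
    for _ in range(num_vectors):
        a, b = rng.randrange(256), rng.randrange(256)
        if dut_adder(a, b) != reference_model(a, b):
            mismatches += 1
    return mismatches
```

In a real flow the `dut_adder` call would be replaced by driving and sampling simulator signals, but the shape of the test – random stimulus, reference model, mismatch count – is the same, and it is exactly the style a Python-trained graduate already knows from unit testing.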
All of these issues need more connection between the different parts of the industry, they say.
“At DVCon we aim to give a platform for engineers – not just verification, we want more system design and software to address these kinds of topics,” said Barnasconi. “Verification needs to connect to the software flow and hardware prototyping, so we need to put this into the design ecosystem. That takes time, to connect communities and industries together, and one of the high-level ambitions is to have the verification and design communities interact.”