February 2010 - Volume 6, Issue 1
“One of my favorite things about this week (DVCon) is the opportunity to meet new friends (and catch up with old friends, too), and I’d love to hear in person what you think about the conference, Verification Horizons, or the industry in general.”
Tom Fitzpatrick, Editor and Verification Technologist
February 2010 Issue Articles
- Using Questa inFact with OVM Sequences
- A Strong Foundation for Design, Verification, and Firmware Requires an Automation Tool for Register Management
- Transaction-Based Testbench Methods Speed Veloce Hardware Acceleration
- Verification Management Eases Those Re-spin Worries
- What’s New with the Verification Academy?
- Reuse in the Real World - Proving the Accellera VIP Interoperability Kit
- Converting from VMM to OVM - A Case Study
- Agile Transformation in IC Development
Using Questa inFact with OVM Sequences
Most OVM users understand how OVM sequences can be used for stimulus generation, implemented as either directed or constrained-random tests. However, they may not be aware that this same OVM sequence construct is supported by Questa inFact intelligent testbench automation.
This article examines how Questa inFact's coverage-driven stimulus generation solution can be deployed in an OVM sequence environment, and compares simulation performance between sequences developed with a constrained-random methodology and those generated by Questa inFact.
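Because inFact plugs in at the OVM sequence level, it can replace a hand-written constrained-random sequence wherever one is started on a sequencer. The following is a minimal sketch of the kind of constrained-random sequence the article contrasts against; the transaction and sequence names (`bus_xfer`, `bus_rand_seq`) and the address constraint are hypothetical, while the OVM base classes and macros are standard.

```systemverilog
`include "ovm_macros.svh"
import ovm_pkg::*;

// Hypothetical bus transaction with a constrained-random address.
class bus_xfer extends ovm_sequence_item;
  rand bit [31:0] addr;
  rand bit [31:0] data;
  constraint addr_range_c { addr inside {[32'h0 : 32'hFFFF]}; }

  `ovm_object_utils_begin(bus_xfer)
    `ovm_field_int(addr, OVM_ALL_ON)
    `ovm_field_int(data, OVM_ALL_ON)
  `ovm_object_utils_end

  function new(string name = "bus_xfer");
    super.new(name);
  endfunction
endclass

// A conventional constrained-random sequence: each `ovm_do
// creates, randomizes, and sends one transaction to the driver.
class bus_rand_seq extends ovm_sequence #(bus_xfer);
  `ovm_object_utils(bus_rand_seq)

  function new(string name = "bus_rand_seq");
    super.new(name);
  endfunction

  virtual task body();
    repeat (10)
      `ovm_do(req)
  endtask
endclass
```

A coverage-driven tool-generated sequence presents the same `ovm_sequence` interface to the testbench, which is what makes the drop-in comparison the article describes possible.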
A Strong Foundation for Design, Verification, and Firmware Requires an Automation Tool for Register Management
Register and memory map management has become critically important, especially for complex ASICs and FPGAs. Gone are the days of describing each register by hand in several different places in the design cycle. Such an error-prone and time-consuming approach is now, thankfully, replaced by off-the-shelf automation. We automated our register management process using IDesignSpec (IDS). It enabled us to describe the registers alongside the functional specification in a document and, using an editor plug-in, automatically generate a variety of outputs. Furthermore, using Tcl scripts we were able to generate new outputs that fit precisely into our existing flows, thereby completely automating our processes.
This article details our experiences using IDS to build a strong, error-free foundation for design, verification, and firmware without spending undue time and resources doing it.
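To make the payoff concrete, this is the kind of RTL a register-automation tool typically emits for a single read/write control register described once in the specification. This sketch is illustrative only and is not IDS's actual output format; the module and port names are hypothetical.

```systemverilog
// Illustrative only: typical generated RTL for one 32-bit R/W
// register, with its reset value taken from the specification.
module ctrl_reg (
  input  logic        clk,
  input  logic        rst_n,
  input  logic        wr_en,   // software write strobe
  input  logic [31:0] wdata,   // software write data
  output logic [31:0] rdata    // current register value
);
  always_ff @(posedge clk or negedge rst_n)
    if (!rst_n)
      rdata <= 32'h0;          // reset value from the register spec
    else if (wr_en)
      rdata <= wdata;          // software write path
endmodule
```

The point of automation is that this RTL, the matching verification register model, and the firmware header all derive from the same single-source description, so they cannot drift apart.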
Transaction-Based Testbench Methods Speed Veloce Hardware Acceleration
In the early 1990s, simulation accelerators from IKOS and Zycad were prevalent on the verification landscape. Offering significantly better performance than software simulators, both companies carved out a profitable existence speeding simulation on large designs. But the rapid advance in workstation performance and simulation efficiency closed the gap, and by the early 2000s, the accelerators had vanished. IKOS made the transition to hardware emulation with its popular VStation series; Zycad was unable to muster the financial resources to make the same leap.
Hardware emulators from Quickturn, IKOS, and Mentor Graphics flourished at the turn of the century, but what really distinguished them from simulation accelerators? Two major factors come to mind. First, emulators didn't try to be simulators, which freed them to run much faster: they were two-state machines that shunned compute-intensive features like timing and fault analysis. Second, they severed ties to workstation-hosted testbenches in favor of running stand-alone (target-less) or wired to physical hardware.
Verification Management Eases Those Re-spin Worries
When verification is not under control, project schedules slip, quality is jeopardized and the risk of re-spins soars. These less-than-stellar outcomes seem to be happening more and more often. First-time success with silicon is waning, down from nearly 40% in 2002 to less than 30% in 2007, according to an independent verification survey funded by Mentor Graphics. Re-spins are mainly due to functional or logical flaws in the design, which suggests an increasing number of problems in the overall verification management process. Among such problems: a dearth of tools that can allow the specification to drive the process and can manage the volumes of data generated during verification.
What’s needed is a common platform and environment that provides all parties – system architects, software engineers, designers and verification specialists – with real-time visibility into the project. And not just into the verification plan, but also into the specifications and the design, both of which tend to change over time. There are three dimensions to any IC design project: the process, the tools and the data. Any comprehensive approach to verification management needs to handle them all.
What’s New with the Verification Academy?
These are exciting times for verification—and in particular, these are exciting times for the Verification Academy. This month we are launching our most ambitious release to date, which consists of the following new content:
- We are augmenting our existing FPGA Verification module with three new sessions.
- We are releasing a new module titled OVM Basics.
Reuse in the Real World - Proving the Accellera VIP Interoperability Kit
The SystemVerilog functional verification community has two open-source libraries and accompanying methodologies: the Open Verification Methodology (OVM) and the Verification Methodology Manual (VMM). Both OVM and VMM are intended to help designers more quickly develop verification IP (VIP) that is inherently interoperable with other IP developed in a homogeneous environment. Even with the benefit of these libraries, developing a typical VIP from scratch is still time-consuming and error-prone, and integrating immature components introduces significant risk to a project. To mitigate those risks as much as possible, the verification architect is compelled to find and integrate existing VIP.
The choices can be very limiting if one is forced to work in homogeneous environments—OVM or VMM, but not both. So, when the Accellera VIP Technical Steering Committee released a beta version of the OVM-VMM Interoperability Kit, it theoretically expanded the pool of VIP available for integration in a functional verification environment. This paper describes the effort and results of testing that hypothesis in the real world.
Converting from VMM to OVM - A Case Study
Following several requests from our customers for an OVM-based testbench, we decided to put together a case study highlighting the steps we took to convert our existing VMM testbench solution to OVM. This gives engineers a good understanding of the conversion process.
However, the conversion raised a couple of initial questions:
- Could the same VMM Testbench and directory structure be reused in an OVM environment?
- How easy would this be given that they are two different verification methodologies?
- How easy would it be to develop an OVM testbench solution from scratch if reuse of our VMM structure was not an option?
- Couldn’t we just use an OVM wrapper and run VMM below the surface?
- How well would OVM work for Synopsys VCS users?
In this article, we discuss how these questions were answered and the final approach we took to develop our OVM solution. We conclude by highlighting our results from the conversion and describing the benefits of having an OVM testbench solution in place. For the purposes of explanation, we have used a UART as the example in the code snippets that follow.
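To give a flavor of the mechanical part of such a conversion, here is a hedged sketch of a UART transaction recast from a VMM data class to its OVM equivalent. The class and field names (`uart_frame`, `payload`, `parity_err`) are hypothetical; the VMM shorthand macros and OVM utility macros are standard.

```systemverilog
`include "ovm_macros.svh"
import ovm_pkg::*;

// VMM style (before) -- shown as a comment for comparison:
// class uart_frame extends vmm_data;
//   rand bit [7:0] payload;
//   rand bit       parity_err;
//   `vmm_data_member_begin(uart_frame)
//     `vmm_data_member_scalar(payload, DO_ALL)
//     `vmm_data_member_scalar(parity_err, DO_ALL)
//   `vmm_data_member_end(uart_frame)
// endclass

// OVM style (after): the base class changes to ovm_sequence_item
// and the field macros provide copy/compare/print automation.
class uart_frame extends ovm_sequence_item;
  rand bit [7:0] payload;
  rand bit       parity_err;

  `ovm_object_utils_begin(uart_frame)
    `ovm_field_int(payload,    OVM_ALL_ON)
    `ovm_field_int(parity_err, OVM_ALL_ON)
  `ovm_object_utils_end

  function new(string name = "uart_frame");
    super.new(name);
  endfunction
endclass
```

The transaction class is usually the easiest piece to migrate; the larger questions listed above concern the environment structure, phasing, and component hierarchy around it.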
Agile Transformation in IC Development
There are opportunities for positive transformation in any IC development team. While people- and process-related initiatives can have a tremendous complementary impact on the technical advancements already occurring in IC development, their value is often overlooked. Technology-driven advancements such as coverage-driven verification and the proliferation of OVM as a functional verification platform have been critical to keeping pace with exploding design size and complexity. However, the difference between success and failure will frequently depend on how effectively people can embrace these new technologies, and what processes are put in place to ensure their optimal use.
The technology itself is rarely the decisive factor in a successful or failed product delivery. In the world of Agile software development, delivering value to a customer is the primary objective, and teamwork and efficient processes are the essential components. This article identifies the characteristics common to effective agile development teams, and how a team that embraces these characteristics, in combination with cutting-edge technology, can accelerate their schedule, improve their