Horizons Newsletter - June 2008

“If last year’s DAC issue was ‘super-sized,’ then I am pleased to welcome you to this year’s super-duper-sized DAC issue of Verification Horizons.

As I write this note, I’ve just gotten home from coaching my son’s baseball team this evening. It was a great game, and there are a number of things I could write about it. But given that we achieved our first win of the season (after three losses), I find myself thinking about “the thrill of victory.” My coaching philosophy is simply for the boys to do their best and have fun. My secondary philosophy is that it’s always more fun when you win, so I always encourage the boys to work together, stay focused on the game, and think about what they need to do on every play.

In gathering my thoughts to write to you tonight, I couldn’t help but reflect back a few months to Dr. Wally Rhines’ keynote address at DVCon, entitled “Ending Endless Verification.” The keynote has served as a “game plan” for us in Design Verification Technology here at Mentor, and it should come as no surprise that this issue of Verification Horizons covers each of the featured technologies discussed by Dr. Rhines, including several articles that highlight the Open Verification Methodology (OVM).”

Tom Fitzpatrick, Editor and Verification Technologist

June 2008 Issue Articles

Achieving DO-254 Design Assurance using Advanced Verification Methods

DO-254 is a standard enforced by the FAA that requires certification of avionics suppliers’ designs and design processes to ensure the reliability of airborne systems. Rockwell Collins had been using a traditional directed-testing approach to achieve DO-254 compliance [1]. This approach is time-consuming, and as their designs became more complex, they wanted to take advantage of the productivity gains a modern constrained-random, coverage-driven environment provides while still ensuring the needs of the DO-254 process could be met.

A methodology was assembled using SystemVerilog with Mentor’s Advanced Verification Methodology (AVM) [4], and a flow was developed that linked the formal requirements documents to the verification management tools in Questa. This allowed coverage data to be used to satisfy the DO-254 process and greatly reduced the verification effort required. The target design, an FPGA DMA engine, was recently certified using this flow to meet DO-254 Level A compliance, the highest level, which is used in mission-critical systems.

The process flow developed for this project will be of interest to anyone who needs high levels of verification confidence with the productivity gains that methodologies such as the AVM or OVM provide.

Request June issue today!

Ending Endless Verification with Questa Formal Verification

Dr. Wally Rhines noted during his DVCon 2008 keynote speech that today’s approach to verification is a frustrating, open-loop process that often does not end—even after the integrated circuit ships. To keep pace with Moore’s law, which has enabled escalating product feature demands, verification efficiencies must increase by at least 100x. Obviously, throwing more engineers and computers at the problem has not provided a scalable solution.

The industry must move away from a model that adds more cycles of verification to one that adds more verification per cycle (that is, maximizing the meaningful cycles per second). Functional formal verification (such as Mentor Graphics’ Questa™ Formal Verification tool), when effectively used, offers significant improvements in verification productivity. The confusion most engineers face when considering functional formal verification is in understanding how to get started effectively.

Traditionally, applying formal property checking has been viewed as an orthogonal process to simulation-based approaches. However, my philosophy is that the two processes are actually complementary. The key to successfully integrating formal into a verification flow is first understanding where, when, and how to apply it.
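
As a small illustration of that complementary use, the assertions below describe a hypothetical FIFO requirement once, in SystemVerilog Assertions, and can then be proven exhaustively by a formal property checker or bound into simulation. All signal and module names here are invented for the sketch, not taken from the article.

// Hypothetical FIFO protocol checks; the same properties serve both
// a formal property checker and simulation.
module fifo_props #(parameter DEPTH = 16) (
  input logic                   clk,
  input logic                   rst_n,
  input logic                   push,
  input logic                   pop,
  input logic [$clog2(DEPTH):0] count
);
  // Requirement: never push into a full FIFO.
  assert property (@(posedge clk) disable iff (!rst_n)
    (count == DEPTH) |-> !push)
    else $error("FIFO overflow: push asserted while full");

  // Requirement: never pop from an empty FIFO.
  assert property (@(posedge clk) disable iff (!rst_n)
    (count == 0) |-> !pop)
    else $error("FIFO underflow: pop asserted while empty");
endmodule

// In simulation the same checks attach to the RTL with bind:
//   bind fifo fifo_props #(.DEPTH(16))
//     u_props (.clk, .rst_n, .push, .pop, .count);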

Request June issue today!

Intelligent Testbench Automation Turbo-Charges Simulation

Several years ago, advances in testbench automation enabled verification engineers to test more functionality by dramatically increasing the quantity of testbench sequences for simulation. Through clever randomization techniques, constrained by algebraic expressions, verification teams were able to create testbench programs that generated many times more sequences than directed testbench programs could. While the actual testbench programming time savings and simulation efficiency were hotly debated topics, few disputed that constrained-random test generation produced orders of magnitude more testbench sequences.

However, addressing the testbench sequence generation challenge through quantitative means (i.e., “more” sequences) caused corresponding challenges during the simulation and debug phases of functional verification. Even when constrained by algebraic expressions, random techniques tend to generate increasingly redundant testbench sequences over the course of simulation, so more simulators must run for longer periods of time to achieve verification goals. In addition, even when using “directed” constrained-random techniques, it is difficult to pre-condition testbenches to “target” interesting functionality early in the simulation. The mathematical characteristics of constrained-random testing that enable the generation of sequences the verification engineer “hadn’t thought of” are the very same characteristics that make it difficult to control and direct the sequence generation process.
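
To make the mechanism concrete, here is a minimal sketch of constraint-based randomization; the transaction fields and ranges are invented for illustration. Note that nothing prevents randomize() from drawing the same legal point twice, which is exactly the redundancy problem described above.

class bus_txn;
  rand bit [31:0] addr;
  rand bit [7:0]  len;
  rand bit        write;

  // Algebraic constraints carve out the legal stimulus space.
  constraint c_addr_range { addr inside {[32'h0000_1000 : 32'h0000_FFFF]}; }
  constraint c_len        { len > 0; len <= 64; }
  constraint c_aligned    { addr[1:0] == 2'b00; }
endclass

module gen_demo;
  initial begin
    bus_txn t = new();
    repeat (10) begin
      if (!t.randomize()) $fatal(1, "randomize() failed");
      // Each call draws another legal point; repeats are possible,
      // which is why redundant sequences accumulate over long runs.
      $display("addr=%h len=%0d write=%b", t.addr, t.len, t.write);
    end
  end
endmodule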

Request June issue today!

Using Questa Multi-View Verification Components and OVM for AXI Verification

On February 18, 2008, Mentor Graphics introduced a new generation of Verification IP called Multi-View Verification Components (MVC). The MVC was created using Mentor’s unique Multi-View technology. Each MVC component is a single model that supports the complete verification effort at the system, transaction, and register transfer levels. The MVC supports automatic stimulus generation, reference checking, and coverage measurements for popular protocols, such as AMBA™ with AHB, APB/APB3, and AXI.

This article shows how to create a reusable SystemVerilog- and OVM-based constrained-random verification environment for AMBA3 AXI using the AXI MVC. More detailed information can be found in the MVC Databook. The MVC enables fast test development for all aspects of the AXI protocol and provides all the SystemVerilog classes, interfaces, and tasks required for both directed and constrained-random testing of AXI master and slave units at the RTL and TLM abstraction levels.
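
As a rough sketch of the OVM plumbing such an environment uses, the code below builds a master and a slave agent through the factory. The agent class names here are placeholders standing in for the real MVC components; the actual class names and configuration API are documented in the MVC Databook.

import ovm_pkg::*;
`include "ovm_macros.svh"

// Placeholder agents standing in for the AXI MVC components.
class axi_master_agent extends ovm_agent;
  `ovm_component_utils(axi_master_agent)
  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction
endclass

class axi_slave_agent extends ovm_agent;
  `ovm_component_utils(axi_slave_agent)
  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction
endclass

class axi_env extends ovm_env;
  `ovm_component_utils(axi_env)
  axi_master_agent master;
  axi_slave_agent  slave;

  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction

  function void build();
    super.build();
    // Factory creation keeps the env reusable: a test can swap in a
    // derived or vendor-supplied agent without editing this file.
    master = axi_master_agent::type_id::create("master", this);
    slave  = axi_slave_agent::type_id::create("slave", this);
  endfunction
endclass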

Request June issue today!

What is a Standard?

The engineering and computing worlds are filled with standards: from COBOL to SystemVerilog, from RS232 to AMBA. As engineers, not a day goes by when we don’t apply a standard of some sort in our work.

What makes a standard a standard? The simple but maybe not so obvious answer is that something is a standard if everyone agrees it is. Is that enough? Who is everyone? To answer these questions we’ll take a brief look at a few standards and see how they came to be considered standards.

In the functional verification world, the Open Verification Methodology (OVM) was recently released as a joint production of Cadence Design Systems and Mentor Graphics Corporation. As a verification methodology for SystemVerilog users, OVM generated a lot of buzz at the recent DVCon conference in San Jose, CA. Although it was only released in the first week of January 2008, over 3,000 people had downloaded copies by the end of May. In this article we show parallels between OVM and other well-known standards and argue that OVM is on the same trajectory toward standardization.

Request June issue today!

Firmware Driven OVM Testbench

The Open Verification Methodology promotes a well defined SystemVerilog transaction-level interface, inviting integration of a host of verification technologies. Firmware has proven effective for functional verification of embedded hardware, so it follows that OVM integration of a firmware execution environment will advance the verification of embedded systems.

To this end, Mentor Graphics has added OVM-compliant interfaces to the Seamless® HW/SW co-simulation tool. This article covers the firmware execution modes, supported processors, and interface points of the Seamless/OVM integration.

Request June issue today!

Design Verification Using Questa and the Open Verification Methodology - A PSI Engineer’s Point of View

This article discusses, from a design verification engineer’s point of view, the benefits of using Questa and the Open Verification Methodology in the verification process. It shows how Questa and advanced features such as its integrated verification management tools and integrated transaction viewing can help achieve verification plan targets.

Questa also enables custom flow integration, such as the PSI-E verification management flow. Coupled with a methodology like OVM, Questa makes it possible to deliver a complete, reusable verification environment efficiently, whatever the project. OVM is a standardized verification methodology, enabling the verification environment to be reused even in a different EDA vendor’s flow.

OVM provides many verification components and lets the verification engineer think about how and what to verify instead of how to write verification components. The only things to code are the drivers, which depend on the device under test. In this article we use a simple example: the test of a UART IP connected to the APB bus of a LEON-2 system-on-chip. We first explain how verification was done before adopting OVM and a hardware verification language. We then explain how OVM and Questa help us achieve better results while speeding up the verification process, by using the SystemVerilog DPI feature to replace the fully functional LEON processor model and by re-using our HDL verification IPs.
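
The DPI-based processor replacement mentioned above might look roughly like the sketch below: firmware compiled for the host calls back into the testbench for bus access. The function and task names are invented for illustration; they are not taken from the PSI flow.

module fw_bridge;
  // Hypothetical C entry point: firmware runs natively on the host
  // instead of on a simulated LEON core.
  import "DPI-C" context task run_firmware();

  // The C side reaches back into SystemVerilog for APB bus access.
  export "DPI-C" task apb_write;
  export "DPI-C" task apb_read;

  task apb_write(input int unsigned addr, input int unsigned data);
    // drive the APB master BFM here (omitted in this sketch)
  endtask

  task apb_read(input int unsigned addr, output int unsigned data);
    // sample the APB master BFM here (omitted in this sketch)
    data = 0;
  endtask

  initial run_firmware();  // native execution is far faster than an
                           // RTL or instruction-set processor model
endmodule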

Request June issue today!

A Practical Guide to OVM – Part 1

This is the first in a series of articles helping you get started with OVM, the Open Verification Methodology for functional verification using SystemVerilog. OVM is supported by a library of SystemVerilog classes. The emphasis in these articles is on getting your code to run while, at the same time, coming to understand more about the structure and purpose of the OVM classes.

OVM was created by Mentor Graphics and Cadence based on existing verification methodologies originating within those two companies, including Mentor’s AVM, and consists of SystemVerilog code and documentation supplied under the Apache open-source license. The official release can be obtained from www.ovmworld.org. The overall architecture of OVM is well described in the Datasheet and White Paper available from that website. This article assumes you have at least some familiarity with SystemVerilog, with constrained-random simulation, and with object-oriented programming.
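
In the spirit of getting code running first, here is a minimal sketch of an OVM test; only the class and message names are invented. The factory locates the test by the name passed to run_test().

import ovm_pkg::*;
`include "ovm_macros.svh"

// Smallest useful OVM program: one test class, run by name.
class hello_test extends ovm_test;
  `ovm_component_utils(hello_test)  // registers the class with the factory

  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction

  task run();
    ovm_report_info("HELLO", "The OVM testbench is alive");
  endtask
endclass

module top;
  initial run_test("hello_test");  // factory creates the test by name
endmodule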

Request February issue today!

Dynamic Construction and Configuration of Testbenches

There are a number of requirements in developing an advanced testbench. Such requirements include making the testbench flexible and reusable, because with complex designs we generally spend as much or more time developing the verification environment and tests as we do developing the DUT. It has been said that any testbench is reusable; it just depends upon how much effort we are willing to put into adapting it for reuse! Even so, there are concepts that go into making a testbench that is reusable with reasonable effort:

  1. Abstract, standardized communication between testbench components
  2. Testbench components with standardized APIs
  3. Standardized transactions
  4. Encapsulation
  5. Dynamic (run-time) construction of testbench topology
  6. Dynamic (run-time) configuration of testbench topology and parameters
  7. Test as top level class
  8. Stimulus generation separate from testbench structure
  9. Analysis components

In this article we will primarily address dynamic (run-time) construction of testbench topology and dynamic (run-time) configuration of testbench topology and parameters. In doing so we will lightly touch on the test as a top-level class and on stimulus generation separate from testbench structure, as these are related topics.
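
A compact sketch of those two techniques using standard OVM mechanisms: the factory constructs the topology at run time, and set_config_int() pushes a parameter down into it. The component and field names are invented for illustration.

import ovm_pkg::*;
`include "ovm_macros.svh"

class my_driver extends ovm_driver;
  int unsigned num_txns = 10;  // default, overridable via configuration

  `ovm_component_utils_begin(my_driver)
    `ovm_field_int(num_txns, OVM_ALL_ON)  // field automation enables auto-config
  `ovm_component_utils_end

  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction
endclass

class my_env extends ovm_env;
  `ovm_component_utils(my_env)
  my_driver drv;

  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction

  function void build();
    super.build();
    // Run-time construction through the factory: a test can override
    // my_driver with a derived type without editing this env.
    drv = my_driver::type_id::create("drv", this);
  endfunction
endclass

class long_test extends ovm_test;
  `ovm_component_utils(long_test)
  my_env env;

  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction

  function void build();
    super.build();
    // Run-time configuration: set before the child is built, picked up
    // automatically when my_driver calls super.build().
    set_config_int("env.drv", "num_txns", 500);
    env = my_env::type_id::create("env", this);
  endfunction
endclass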

Request February issue today!

OVM Productivity using EZVerify

Verification of a chip is easily the most time-consuming task confronting the product team. Increasingly, verification engineers are using innovative technologies and newer methodologies to achieve satisfactory functional verification. SystemVerilog is fast becoming the language of choice for implementing verification projects. Its rich set of verification-friendly constructs, IEEE standard status, and support across multiple vendor platforms account for its overwhelming acceptance.

Verification-specific constructs in SystemVerilog include object-oriented data structures, support for constrained-random stimulus generation, assertion specification, and functional coverage modeling. The Open Verification Methodology (OVM) uses the latest SystemVerilog constructs to provide users with a powerful verification infrastructure. OVM-based teams will benefit greatly from productivity solutions that analyze user files for errors in use model and implementation, as well as provide ways to better understand the OVM methodology and hierarchy.

VeriEZ’s EZVerify is a unique tool suite that offers OVM users a static analysis tool (EZCheck) to perform over 30 OVM rule-checks and a knowledge extraction tool (EZReport) to create persistent documents that outline hierarchy and connectivity.

Request February issue today!

Tribal Knowledge - Requirements-Centric Verification

  • What is a design requirement?

  • What is a verification requirement?

  • How do I define a stimulus generation sequence or scenario?

  • How do I decide what to check in a scoreboard, and what to check in an assertion?

  • How do I define coverage points?

In an ideal world, a design specification would detail all the necessary requirements of the design under test (DUT). Yet even if this were the case, it would most probably still not clarify everything that the design team needs to implement the requirements and the verification team needs to verify adherence to them. For instance, the design specification might give general information about an interface, but a second protocol specification might be needed to solidify the necessary details of that requirement. Even with multiple documents to reference, the requirement is quite often still unclear and ambiguous.

Typically, the necessary information is in someone’s head, and a series of discussions is needed to refine the requirement for actual use. Often one big, ambiguous requirement is broken up into several refined requirements. On top of this, some requirements are not evident up front at all; they tend to emerge only as the project progresses, with no one thinking of them until the actual implementation is underway. Clearly, it would be beneficial to have some structured way to gather and maintain design requirements.
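
As one structured way to close that loop, each refined requirement can be tied to an explicit coverage point and, where it is cycle-accurate, to an assertion. The UART-flavored requirement and signal names below are hypothetical.

// Requirement R-017 (hypothetical): the UART shall be exercised in all
// four combinations of parity-enable and stop-bit settings.
module req_cov (input logic clk, rst_n,
                input logic parity_en, two_stop,
                input logic tx_start, tx_busy);

  covergroup cfg_cov @(posedge clk);
    option.name = "R_017_uart_config";
    parity_cp : coverpoint parity_en;
    stop_cp   : coverpoint two_stop;
    cfg_cross : cross parity_cp, stop_cp;  // all four combinations
  endgroup

  cfg_cov cov = new();

  // A cycle-accurate detail of the requirement belongs in an assertion
  // rather than in the scoreboard:
  assert property (@(posedge clk) disable iff (!rst_n)
    tx_start |=> tx_busy)
    else $error("R-017: transmitter not busy the cycle after start");
endmodule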

Request June issue today!

 