
To be the man, you've gotta beat the man!

I admit it. I'm one of those rare adults willing to admit that he (or she) is a true fan of professional wrestling; I have been since I was a kid in the early 70's. Over all those years, probably the greatest athlete in professional wrestling has been the recently retired "Nature Boy" Ric Flair. A multi-time world champ, Ric is known for one quote above all: "To be the man, you've gotta beat the man — whoo!" In professional wrestling, this refers to the fact that the champ only loses his title if his opponent can cleanly beat him, by pinfall or submission, in the middle of the ring. The champ can lose by count-out or disqualification, but in those situations he does not lose the title. If you want to be the champ, you have to clearly demonstrate superiority.

Lately, I've been thinking about this comment as it relates to the world of EDA and, for me personally, to the world of physical verification. In many ways, the same concept holds. To displace an EDA tool at a company, you have to show significant value versus the incumbent solution, because any new solution carries a significant cost. This cost is not only the potential software cost, but also the cost of retraining users who are accustomed to the previous tool, and possibly the cost of supporting different hardware or OS platforms. The general rule of thumb in EDA is that in order to justify switching to a new tool, you need to show 5-10X the value of the previous one.

But this is a daunting task. How do you measure and demonstrate value? Some things are relatively easy to quantify: how long a tool takes to complete a task, or how much hardware (CPUs, memory, and disk) it requires. Others are more intangible: how easy the tool is to use, how stable it is, how well it is supported. Because these are so hard to quantify, the normal practice is to focus on the factors that can be measured and look for at least a 3X improvement.

Consider this in the context of the physical verification market. In the past few years, the market has seen a slew of new offerings. Cadence introduced PVS. Magma introduced Quartz, and now appears to be re-introducing it. Most recently, Synopsys has introduced ICV. In the meantime, Calibre continues not only to lead the market, but to take market share away from the competition.

This is largely because Mentor continues to focus on and improve the Calibre product. For example, in preparation for the 45nm process node, Mentor recognized that increasing rule complexity was becoming a problem both for rule coding and for runtimes. To address this, it introduced capabilities such as TVF, a Tcl-based syntax that allows programming constructs to simplify rule file writing; hyperscaling, which dramatically reduced runtimes; and dynamic results visualization, which enabled users to debug results in parallel while the job continued to run. Moving forward, in preparation for 32nm, Mentor recognized that the historic syntaxes were no longer suitable to capture all failures. As a result, equation-based DRC (eqDRC) was introduced. To address the growing number of failures associated with ESD and other electrical issues, PERC, a circuit-topology-aware checker, was also introduced. Looking ahead, we can expect 22nm to bring requirements for advanced pitch and grid checking, double patterning compliance checking, combined electrical and physical checking, and more. It is also likely that system-in-package designs will become more common, requiring yet more physical verification functionality. All of these capabilities and more are already in development and being rolled out to advanced customers on the Calibre platform.

In the meantime, the competition seems to be playing catch-up. The promised features bear a surprising resemblance to the functionality Calibre deployed years ago for 45nm. It makes you wonder: how do you show that 5-10X value differentiator, or even a 3X performance advantage, with a new offering against the industry-adopted Calibre, when Calibre already has such a head start? I'm looking forward to potential renewed competition in this area, but something tells me the alternatives will continue to fall short and, to quote the Nature Boy, Calibre will continue to be the industry's "kiss-stealing, wheeling-dealing, limousine-riding, jet-flying son of a gun — whoo!"

DRC, LVS, Physical Verification, Ric Flair


About John Ferguson

John Ferguson has spent the past 13 years focused on physical verification. As a lead technical marketing engineer, his time is dedicated to understanding new requirements and ensuring that the continued development of Calibre is properly focused and prioritized. This includes understanding the requirements of the latest process nodes to ensure that all checks can be accurately coded and implemented in Calibre, as well as ensuring that the use models and debugging information are presented in a manner that lets users work most efficiently. Visit John Ferguson's Blog
