To be the man, you've gotta beat the man!
I admit it. I’m one of those rare adults who is willing to admit to being a true fan of professional wrestling; I have been since I was a kid in the early ’70s. Over all of those years, probably the greatest athlete in professional wrestling has been the recently retired “Nature Boy” Ric Flair. A multi-time world champion, Ric is best known for the quote “to be the man, you’ve gotta beat the man — whoo!” In the world of professional wrestling, this refers to the fact that the champ loses his title only if his opponent can cleanly beat him, by pinfall or submission, in the middle of the ring. The champ can lose by count-out or disqualification, but in those situations he does not lose the title. If you want to be the champ, you have to clearly demonstrate superiority.
Lately, I’ve been thinking about this comment as it relates to the world of EDA, and for me personally, to the world of physical verification. In many ways, the same concept holds. To displace an EDA tool at a company, you have to show significant value versus the incumbent solution, because any new solution carries a significant cost. That cost is not only the potential software cost; it also includes the cost of training users who are accustomed to the previous tool, and it may require moving to different hardware or OS platforms. The general rule of thumb in EDA is that to show enough value to justify switching to a new tool, you need to demonstrate 5-10X the value of the previous one.
But this is a daunting task. How do you measure and demonstrate value? Some things are relatively easy to quantify, like how long a tool takes to complete a task and how much hardware (CPUs, memory, and disk) it requires. Others are far less tangible: how easy the tool is to use, how stable it is, how well it is supported. Because these are so hard to quantify, the normal practice is to focus on the metrics that can be measured and look for at least a 3X improvement.
Consider this in the context of the physical verification market. In the past few years, the market has been bombarded with a slew of new offerings. Cadence introduced PVS. Magma introduced Quartz, and now appears to be re-introducing it. Most recently, Synopsys has introduced ICV: http://finance.yahoo.com/news/Synopsys-Launches-IC-prnews-15198667.html?.v=1. In the meantime, Calibre continues not only to lead the market, but to take market share away from the competition.
This is largely because Mentor continues to focus on and improve the Calibre product. For example, in preparation for the 45nm process node, Mentor recognized that increasing rule complexity was becoming a problem, both for rule coding and for runtimes. To address this, they introduced capabilities such as TVF, a Tcl-based syntax that allows programming constructs to simplify rule file writing; hyperscaling, which dramatically reduced runtimes through improved scaling; and dynamic results visualization, which lets users debug results in parallel while the job continues to run. Moving forward, in preparation for 32nm, Mentor recognized that the historic syntaxes could no longer capture all failure modes. As a result, equation-based DRC (eqDRC) was introduced. To address the growing number of failures associated with ESD and other electrical issues, PERC, a circuit-topology-aware checker, was also introduced. Looking ahead, we can expect 22nm to bring requirements for advanced pitch and grid checking, double patterning compliance checking, combined electrical and physical checking, and more. It is also likely that system-in-package designs will become more common, requiring yet more physical verification functionality. All of these and more are already in development and being rolled out to advanced customers on the Calibre platform.
In the meantime, the competition seems to be playing catch-up. The promised features bear a surprising resemblance to the functionality Calibre deployed years ago for 45nm. It makes you wonder: how do you show that 5-10X value differentiator, or even a 3X performance advantage, with a new offering against the industry-adopted Calibre, when Calibre already has such a head start? I’m looking forward to potential renewed competition in this area, but something tells me the alternatives will continue to fall short and, to quote the Nature Boy, Calibre will continue to be the industry’s “kiss stealing, wheeling, dealing, limousine riding, leer jet flying, son of a gun — whoo!”