
David's DAC09 - Another Special Guest

Well, day two of DAC started a little earlier than the first day. I had to attend the speakers' breakfast for the paper I was going to give later that day. After breakfast, I gave my 9am suite presentation on eqDRC again, and once again I had a special guest. This time it was Robert Boone from Freescale in Austin, TX. He works on the DFM team, and he agreed to come tell everyone what he and Freescale have been doing with eqDRC.

What was fun for me is that Robert's talk was quite different from the one Jim Culp from IBM had given the day before. Jim's was all about power analysis, while Robert's was all about applications to recommended-rule-based DFM. Here is a mug shot of Robert giving his presentation. Sorry about the quality, Robert. :)

[Photo: Robert Boone giving his eqDRC suite presentation]

Robert first showed an example of how Freescale had started using eqDRC for regular DRC applications, but he quickly moved on to his two primary applications: DFM scoring and DFM improvability. Freescale developed these flows while working with STMicroelectronics in a joint venture for automotive design. The slide below shows an overview of what they do for scoring recommended rules.

[Slide: DFM scoring overview]

In their design rule manual they show not only the design rule limit but also three levels of DFM limits (L1, L2, L3). They know that the impact of recommended rule violations varies dramatically with dimension, as shown in the colored curve in the bottom left. They use eqDRC capabilities to grade each violation on a continuous scale. As the next slide shows, the severity levels give the designers target (bin) thresholds for improvement.

[Slide: DFM scoring severity bins and weight table]

Each bin is weighted on a non-linear scale, as shown in the table on the right. Notice that the weights vary both by dimension and from high-criticality rules to low-criticality rules. This matrix helps them make tradeoffs between violations and rules. Robert said that the ability to create these scores has really helped them understand, measure, and track their designs and design methodology. Over time they have been able to track the improvement in their scores as they implement new tools, procedures, and practices.
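If you like to think in code, here is a minimal Python sketch of the idea: grade each violation on a continuous scale between the DRC limit and the outermost DFM level, bin it, and accumulate a weighted score. Everything here, the thresholds, the bin direction, the linear interpolation, and the weight matrix, is an illustrative assumption on my part; Freescale's actual curves and weights are the ones in the slides.

```python
# Illustrative sketch only: thresholds, bin direction, interpolation, and
# weights are assumptions, not Freescale's actual scoring model.

def severity(dim, drc_limit, l3):
    """Continuous severity in [0, 1] for a measured dimension.

    Assumes drc_limit < l3, where l3 is the outermost DFM level: at or
    beyond l3 the recommended rule is fully met (0.0); at the hard DRC
    limit the violation is worst-case (1.0). Linear interpolation stands
    in for the real non-linear yield-impact curve from the slide.
    """
    if dim >= l3:
        return 0.0
    if dim <= drc_limit:
        return 1.0
    return (l3 - dim) / (l3 - drc_limit)

def bin_of(dim, l1, l2, l3):
    """Assign a severity bin from the three DFM levels (L1 < L2 < L3)."""
    if dim >= l3:
        return None      # meets all recommended limits
    if dim >= l2:
        return "L3"      # mildest bin
    if dim >= l1:
        return "L2"
    return "L1"          # most severe bin

# Hypothetical non-linear weights per (rule criticality, bin).
WEIGHTS = {
    ("high", "L1"): 10.0, ("high", "L2"): 4.0, ("high", "L3"): 1.0,
    ("low",  "L1"):  2.0, ("low",  "L2"): 0.5, ("low",  "L3"): 0.1,
}

def design_score(violations):
    """Sum weighted severities; each violation is
    (criticality, dim, drc_limit, l1, l2, l3). Lower is better."""
    total = 0.0
    for crit, dim, drc, l1, l2, l3 in violations:
        b = bin_of(dim, l1, l2, l3)
        if b is not None:
            total += WEIGHTS[(crit, b)] * severity(dim, drc, l3)
    return total
```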

One of those new practices was "improvability." This is a metric they created with the Calibre infrastructure (including eqDRC) to understand how much low-hanging fruit is available in a design that could easily be fixed by the designer. It is kind of a "stop your whining and fix the simple stuff" metric. The next slide shows what they mean.

[Slide: DFM improvability concept]

They use Calibre to find places where simple local improvements can be made without impacting other rules or design area. If the improvement is sufficient to move a violation along the score curve from one bin to another, then they consider it important enough to fix. To deal with the tradeoffs between the various recommended and design rules, they use the system shown in the next slide.
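Here is a hedged sketch of that bin-movement test, reusing the bin_of() helper from the scoring sketch above; the bin ordering is again my own assumption.

```python
# Improvability test sketch: a local fix is flagged only if it moves the
# violation into a better bin. Reuses bin_of() from the scoring sketch.

BIN_RANK = {"L1": 0, "L2": 1, "L3": 2, None: 3}  # higher rank = better

def is_improvable(dim_now, dim_after_fix, l1, l2, l3):
    """True if the achievable local fix crosses at least one bin boundary."""
    return (BIN_RANK[bin_of(dim_after_fix, l1, l2, l3)]
            > BIN_RANK[bin_of(dim_now, l1, l2, l3)])
```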

[Slide: DFM improvability tradeoff table]

The previous table is expanded to compare the improvement against any detriment it may cause in another rule. This is where eqDRC becomes so helpful: determining whether a fix is the right thing to do requires a mathematical analysis of the options. It also helps prevent potential oscillation between fixes and new violations. The example further below really shows how they apply all this.
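As a rough sketch of that tradeoff, assuming the score deltas come from a weighted severity model like the one above: only apply a fix if the score gained on the improved rule strictly outweighs the score lost on every rule the edit degrades. The strict inequality (or an explicit margin) is one simple way to keep fixes from oscillating.

```python
# Tradeoff sketch: my own simplification of the analysis described above.

def net_benefit(improvement_gain, detriments):
    """improvement_gain: score reduction on the rule being improved.
    detriments: score increases the same edit causes on other rules."""
    return improvement_gain - sum(detriments)

def should_fix(improvement_gain, detriments, margin=0.0):
    # Require strictly positive net gain (plus an optional margin) so a
    # later pass cannot profitably undo this edit and re-trigger it.
    return net_benefit(improvement_gain, detriments) > margin
```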

[Slide: Improvability example for a field poly widening rule]

In this rule, which encourages widening field poly where possible, you can see in the table that two other recommended rules and four other design rules are analyzed to determine which violations are improvable. In the picture, the poly is green, the active is red, the violations are cyan, and the improvable edges are magenta. Freescale uses these scoring and improvability decks in many ways, as shown in the following slide.

[Slide: Uses of the scoring and improvability decks]

The first two uses are in optimizing the router techLEF setups and the pCell generators for automated cell migration. They run experiments with different settings to see which ones produce better scores and leave fewer improvable locations. They then use the decks to drive manual optimization on top of the automated optimization. As the table shows, they identified that 10 cells out of the 600+ cell library accounted for 75% of the yield loss, because their utilization was so high and their scores were so bad. By focusing design effort on those 10 cells, they show in the table that they can get almost 1% yield improvement on a large design from defect-limited modes alone. Remember, this analysis was done on a mature process that had already been optimized by the automated tools. This is additional yield they squeezed out for volume-production margin improvement, and it doesn't even account for improvements in parametric variability and yield. I like the last reason on the slide as well: you just learn stuff when you have a tool that can measure your quality and give you useful feedback.
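The 10-of-600 result is essentially a Pareto ranking: a cell's contribution to yield loss scales with how often it is placed times how badly it scores per instance. Here is a small sketch with invented cell names and numbers showing how quickly a few heavily used, badly scoring cells can dominate the total.

```python
# Pareto-ranking sketch; all cell names and numbers below are invented.

def pareto_cells(cells):
    """cells: name -> (instance_count, per_instance_dfm_score).
    Returns (name, contribution, fraction_of_total), worst first."""
    contrib = {n: cnt * score for n, (cnt, score) in cells.items()}
    total = sum(contrib.values())
    ranked = sorted(contrib.items(), key=lambda kv: kv[1], reverse=True)
    return [(name, c, c / total) for name, c in ranked]

cells = {"NAND2X1": (120_000, 4.0), "INVX1": (200_000, 1.5),
         "DFFX1": (30_000, 9.0), "AOI21X1": (5_000, 2.0)}
for name, contrib, frac in pareto_cells(cells):
    print(f"{name:8s} {contrib:10.0f} {frac:6.1%}")
```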

It was great having Robert present. Both he and Jim will be giving WebEx seminars later (in the August or September time frame) on this material, so if you missed it you can hear it from them at those sessions. Keep an eye out on the Mentor website and in your email for the announcements. Much of the work that Freescale and ST have done has also been documented in two User2User papers that they wrote. Here are links to them if you are interested in more detail in the short term.

http://supportnet.mentor.com/member/u2u/2007/papers/PatrickLeMaitre_NXP_paper.pdf

http://supportnet.mentor.com/member/u2u/2008/papers/melchiori_stmicroelectronics_paper.pdf

I hope all of you are having as much success with DFM as they are!


