
DFM for Non-PhD's: Part 3 - Real Life Examples

I received several questions after the last installment of this series asking for pictures of defects that caused yield issues in production and that could have been avoided during design. It struck me that most designers never get a chance to see the manufacturing problems their designs encounter. Since my background is in the fab, I wrongly assumed everyone had lived through the same pain I had. It's a great question, so I decided to focus this installment on real-life examples.

There are three basic types of DFM issues that a design can encounter: random, systematic, and parametric. Random defects occur independent of the design layout, but the probability of the design failing because of them is dependent on the layout. Here are some examples of the defects I am talking about.


The image on the left is a composite wafer map showing the location of all the particle defects that occurred on this wafer during processing. By composite I mean the sum of the particles that occurred at various points during the manufacturing process and are located within various layers of the design. This map gives you a feel for the spatial distribution and occurrence rate of particles. For the most part, you can see that they are randomly distributed across the wafer, independent of the repeating pattern of the design die. The exceptions are the defects that occur in the circular (spirograph-like) patterns. These are scratches generated during the CMP process as the grinding pad rubbed a hard particle across the wafer. The pictures on the right are optical and SEM images of some of the defects.

Depending on where and when the particles occur, they can have three possible effects: they can cause an electrical short, an electrical open, or they can have no effect at all if they land in open areas or are too small to create a complete short or open. This is how the design can affect the yield. It is clear from the wafer map that if every particle caused a short or open, none of the die on this wafer would yield. In reality, a design layout is relatively empty on a given layer. Thought of as layout density, most parts of the layout on a given layer are less than 50% dense, meaning half of the space is unused. Therefore, only a percentage of the defects land on active circuitry, and of those, only a percentage are big enough to cause a complete short or open. By utilizing the open space of the layout more effectively, a designer can limit the susceptibility of the design to these particles. Critical Area Analysis (CAA) is the DFM tool used to assess the design's sensitivity to random defects. By measuring and reducing the amount of critical area, design teams can improve their yield.
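To make the critical-area idea concrete, the relationship between critical area, defect density, and yield is often illustrated with a simple Poisson model. This is only a rough sketch, not the Calibre flow, and every number in it is made up for illustration.

```python
import math

def poisson_yield(critical_area_cm2: float, defect_density_per_cm2: float) -> float:
    """Poisson yield model: probability that no killer defect lands on the critical area."""
    return math.exp(-critical_area_cm2 * defect_density_per_cm2)

# All numbers below are hypothetical, for illustration only.
d0 = 0.5               # defect density in defects/cm^2 (assumed)
a_crit_before = 0.10   # critical area before spreading wires into open space (assumed, cm^2)
a_crit_after = 0.08    # critical area after layout optimization (assumed, cm^2)

y_before = poisson_yield(a_crit_before, d0)
y_after = poisson_yield(a_crit_after, d0)
print(f"yield before: {y_before:.3f}, after: {y_after:.3f}")  # ~0.951 vs ~0.961
```

The point of the sketch is simply that shrinking critical area shrinks the exponent, so even modest layout changes translate directly into yield.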
An example of a design change affecting this sensitivity is shown below.


This comes from a paper I did with LSI at DesignCon this year. We used the Calibre YieldEnhancer tool to find opportunities for via doubling that their router missed on four different designs. We then ran Calibre YieldAnalyzer to assess the critical area impact of doubling the extra vias and the yield impact it would have in production. You can see that the design yields increased by up to 2% from making these few incremental changes on top of what the router had already done, on a process that was already running at mature yields. On a high-volume product, 2% could mean a lot of extra profit. Imagine the impact of a broad range of changes throughout the design flow.
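The intuition behind via doubling can be shown with a toy calculation: if a single via opens with some small independent probability, a redundant pair fails only when both vias do. Every number here is an assumption for illustration, not LSI's data.

```python
p_single = 1e-6            # assumed failure probability of one via
p_double = p_single ** 2   # a doubled via fails only if both vias fail

n_vias = 1_000_000         # assumed via count in a design
y_single = (1 - p_single) ** n_vias   # yield limited by single vias
y_double = (1 - p_double) ** n_vias   # yield if every via were doubled
print(f"single: {y_single:.3f}, doubled: {y_double:.6f}")  # ~0.368 vs ~0.999999
```

Squaring an already-small failure probability is why each additional redundant via the router (or a post-route tool) can insert buys real yield.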

The second type of DFM issue that a design can encounter is systematic defects. These are defects that occur only when a particular layout construct interacts with a particular process variation. Again, the problem is statistical, in that the process exhibits the particular variation only a small percentage of the time, and only a narrow range of layout constructs are susceptible to it. Several examples are shown below.


In this first example, an electrical short and an electrical open are shown that were caused by variation in the lithography process interacting with these particular layout constructs. You can see that the bulk of the patterns are produced without issue and the problem was very localized. These locations print perfectly fine at nominal litho dose and focus, but at one edge of the process variation these spots image improperly. These particular locations have a non-zero probability of this occurring, but the probability is not 100%. Tools like Calibre Litho Friendly Design (LFD) are used to identify these types of litho sensitivities.


In this second example, an electrical short is shown that was caused by the interaction of the previous layers with the CMP process. You can see in the picture on the left that all the lower levels of metal were aligned with the same spacing and width. This caused a slight thickness variation on each layer that added up as each layer was polished. Then, in the top layer, the layout was different, and the severity of the depression had accumulated to the point that the CMP process did not clear all the copper in the depressed area, leaving a slight amount of copper bridging the two wires. Again, these particular locations have a non-zero probability of this occurring, but the probability is not 100%. Tools like Calibre CMPAnalyzer (CMPA) are used to identify these types of CMP sensitivities, and tools like Calibre YieldEnhancer are used to do "smart" fill to correct them.


In this example, an electrical open is shown that is caused by the migration of small voids (bubbles, essentially) in the copper metal, which move to a point of stress relief and accumulate until they create a void large enough to cause an open. This phenomenon occurs when large areas of copper are in proximity to a single via; the via tends to act as a point of stress relief. Again, the probability of it occurring is non-zero but not 100%. As the graph on the right shows, the probability varies dramatically with the width of the wire in this particular test structure.


In this example, a non-problem becomes a problem in a very limited combination of multiple layout dimensions. The dielectric deposition process that covers poly and active prior to cutting the local interconnect (LI) holes produces "keyholes" at certain gate spacings, as shown in the picture on the right. Normally, these are no problem and do not affect anything about the circuit. However, when two LI cuts with a small spacing between them are made between these gates, as shown in the layout on the left, an unexpected problem occurs. The keyhole acts as a tunnel between the two LI cuts, and when the titanium liner is deposited in the cut, small amounts of Ti diffuse into the tunnel. If the LI cuts are close enough together, the tunnel is short enough for the diffused Ti from each side to touch, causing a short, as shown in the picture in the middle. Again, the probability of it occurring is non-zero but not 100%, and it is highly dependent on both the gate and LI spacing simultaneously.


In this final example, electrical shorts have an increased probability of occurring when minimum-spaced metal wires of minimum width run beside each other for long distances. It is due to surface tension from evaporating water during the develop rinse and dry step. It is very sensitive to feature dimensions and has a non-zero but not 100% probability of occurring.

For the last three examples, there is no dedicated process-simulator-based DFM solution in the EDA industry for identifying these types of issues. In these cases, people use Calibre YieldAnalyzer to create statistically based recommended rule analysis reports for these issues as they find them. We call this type of analysis Critical Feature Analysis (CFA). The idea is to take multi-dimensional measurements of the layout, relate them mathematically to generate some level of empirical model of the probability or risk of these types of occurrences, and then roll up the statistical probability at the block or chip level. Armed with this information, the designer can prioritize the various features by sensitivity and drive down the overall statistical probability of failure. This in turn improves the yield. An example of this was demonstrated by Samsung in the Common Platform joint paper at SPIE this year, shown below.


The table on the left shows the difference in the MCD score between the DFM-enhanced and the nominal design. MCD is the Common Platform implementation of the Calibre YieldAnalyzer CFA solution. They ran the two versions of the layout side by side on a test chip. The table on the right shows that the DFM version yielded ~8% better than the non-optimized one. The MCD score doesn't predict the exact amount, but there is a strong statistical correlation between improvement in these DFM quality scores from CFA and yield in production.
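The chip-level roll-up that CFA performs can be sketched in a few lines. The feature names and per-instance fail probabilities below are invented for illustration; in practice the empirical model comes from silicon characterization.

```python
# (fail probability per instance, instance count) -- all values are assumed
features = {
    "min_space_long_parallel_run": (1e-7, 50_000),
    "single_via_under_wide_metal": (5e-7, 10_000),
    "close_li_cuts_between_gates": (2e-6, 1_000),
}

chip_yield = 1.0
for name, (p_fail, count) in features.items():
    feature_yield = (1.0 - p_fail) ** count  # every instance must survive
    chip_yield *= feature_yield
    print(f"{name}: limited yield {feature_yield:.4f}")

print(f"combined feature-limited yield: {chip_yield:.4f}")
```

Ranking features by their limited yield is what lets a designer prioritize which constructs to fix first for the biggest statistical payoff.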

The last type of DFM issue that a design can encounter is parametric variability. This might not accurately be called "yield loss," as it depends on your product specifications. However, some layout configurations experience much more variation than others, in a way that doesn't cause a short or open but instead causes variation in some product performance measure. Again, I will use a litho example.


In this example, the L-shaped piece of poly rounds off when printed on the wafer. Because the bend is so close to the active area edge, it affects the gate length at the edge of this transistor. The difference will vary as alignment and exposure vary during processing. By moving the bend farther away, or reducing how far the bend runs parallel with the active edge, the designer can reduce the variation he or she will see in production. Recommended rules in general are layout guidelines that relate to statistical yield loss and parametric variability. In other words, they are rules you don't always have to follow, but the more of them you follow, the greater the reduction in statistical variability you will see in the product. The following are good examples.


The left example shows that changing the contact-to-gate spacing from the minimum design rule to the increased recommended rule reduces the Ioff leakage of the transistor by 35%. A 35% change in one transistor may not be critical, but if a statistically significant number of transistors have room to make this change, it will have a statistically significant impact on chip leakage. The second example shows a 10% change in the resistivity of poly as the width varies from the minimum DRC rule to the RR. The third example indicates a significant change in the Idsat of a transistor as the gate spacing is changed from minimum DRC to RR. The bottom line is summed up well in the following data from ARM.


In this experiment, they show five different implementations of the same cell. The graph shows how the performance of the cell varied across the implementations, and the table shows a change in relative yield between the approaches. All of them passed DRC and LVS! Design does make a difference, and using DFM tools to guide your optimization will make a difference.
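To see how a per-transistor recommended-rule gain like the 35% Ioff reduction mentioned earlier scales to the chip, here is a back-of-the-envelope calculation. Every count and current in it is an assumption for illustration.

```python
# Chip-level effect of the 35% Ioff reduction from the contact-to-gate
# recommended rule. All counts and currents below are assumed.
n_transistors = 10_000_000   # transistors on the chip (assumed)
movable_fraction = 0.30      # fraction with room to apply the RR spacing (assumed)
ioff_per_fet_nA = 1.0        # nominal off-state leakage per transistor (assumed)
rr_reduction = 0.35          # the 35% Ioff improvement cited in the example

total_ioff_mA = n_transistors * ioff_per_fet_nA * 1e-6
saved_mA = n_transistors * movable_fraction * ioff_per_fet_nA * rr_reduction * 1e-6
print(f"total Ioff: {total_ioff_mA:.1f} mA, saved by RR spacing: {saved_mA:.2f} mA")
```

Even with only a third of the transistors able to move, the chip-level leakage drops by roughly ten percent, which is why statistically applied recommended rules matter.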

I hope these examples help you better understand the importance of investing in DFM tools, practices and methodologies.



About David Abercrombie

I am the Advanced Physical Verification Methodology Program Manager at Mentor Graphics in Wilsonville, Oregon. For the last five years at Mentor, I have been driving the roadmap for developing EDA tools to solve the growing issues in design-to-process interactions (DFM) that are creating ever-increasing yield problems in advanced semiconductor manufacturing. For the previous 15 years, I drove yield enhancement programs in semiconductor manufacturing at LSI Logic, Motorola, Harris, and General Electric. I also led software development teams delivering yield enhancement and data mining solutions to semiconductor manufacturing. I hope you will read my publications and patents on semiconductor processing, yield enhancement, and EDA verification solutions. I received my BSEE from Clemson University in 1987 and my MSEE from North Carolina State University in 1988. I love to play the guitar, explore the great outdoors, and watch a great science fiction show.


