
Posts Tagged ‘variability’

Edge Placement Error Control in Multi-Patterning

Thursday, March 2nd, 2017


By Ed Korczynski, Sr. Technical Editor

SPIE Advanced Lithography remains the technical conference where the leading edge of minimum-resolution patterning is explored, even though photolithography is now only part of the story. Leading OEMs continue to impress the industry with more productive argon-fluoride immersion (ArFi) steppers, but the photoresist suppliers and the purveyors of vacuum deposition and etch tools now provide most of the new value-add. Tri-layer resist (TLR) stacks, specialty hard-masks and anti-reflective coatings (ARC), and complex thin-film depositions and etches all combine to create application-specific lithography solutions tuned to each critical mask.

Multi-patterning using complementary lithography—ArFi steppers to pattern 1D line arrays plus extreme ultraviolet (EUV) tools to do line cuts—is under development at all leading-edge fabs today. Figure 1 shows that edge placement error (EPE) in lines, cut layers, and vias/contacts between two orthogonally patterned layers can result in shorts and opens. Consequently, EPE control is critical for yield within any multi-patterning process flow, including litho-etch-litho-etch (LELE), self-aligned double patterning (SADP), and self-aligned quadruple patterning (SAQP).

Fig.1: Plan view schematic of 10nm half-pitch vertical lines overlaid with lower horizontal lines, showing the potential for edge-placement error (EPE). (Source: Y. Borodovsky, SPIE)

Held the day before the official start of SPIE-AL, Nikon’s LithoVision event featured a talk by Mark Phillips, Intel Fellow and director of lithography hardware solutions, on the big picture of how the industry may continue to pattern smaller IC device features. Regarding the timing of Intel’s planned use of EUV litho technology, Phillips reiterated that, “It’s highly desirable for the 7nm node, but we’ll only use it when it’s ready. However, EUVL will remain expensive even at full productivity, so 193i and multi-patterning will continue to be used. In particular we’ll need continued improvement in the 193i tools to meet overlay.”

Yuichi Shibazaki—Nikon Fellow and the main architect of the current generation of Nikon steppers—explained that today’s 193i steppers, featuring throughputs of >200 wafers per hour, have already been optimized to the point of diminishing returns. “In order to improve a small amount of performance it requires a lot of expense. So just improving tool performance may not decrease chip costs.” Nikon’s latest productivity offering is an alignment station converted into a stand-alone tool, intended to measure every product wafer before lithography and allow feed-forward tuning of any stepper; cost and cost-of-ownership may be disclosed after the first beta-site tool reaches a customer by the end of this year.
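Feed-forward tuning from pre-exposure metrology generally works by fitting a simple grid model to measured alignment-mark offsets and handing the fitted terms (the “correctables”) to the exposure tool. The sketch below is a generic illustration of that idea, not Nikon’s algorithm; the five-parameter linear model, function names, and all numbers are assumptions for the example.

```python
import numpy as np

def fit_correctables(xy, dxdy):
    """Fit a five-parameter linear grid model to measured mark offsets.

    xy: (N, 2) mark positions in mm; dxdy: (N, 2) measured offsets in nm.
    Model (one common linearization): dx = Tx + Mx*x - R*y; dy = Ty + My*y + R*x.
    """
    x, y = xy[:, 0], xy[:, 1]
    ones, zeros = np.ones_like(x), np.zeros_like(x)
    A = np.vstack([
        np.column_stack([ones, zeros, x, zeros, -y]),   # dx rows
        np.column_stack([zeros, ones, zeros, y, x]),    # dy rows
    ])
    b = np.concatenate([dxdy[:, 0], dxdy[:, 1]])
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return dict(zip(["Tx", "Ty", "Mx", "My", "R"], params))

# Hypothetical measurement of four marks on one product wafer:
marks = np.array([[-100.0, -100.0], [100.0, -100.0],
                  [100.0, 100.0], [-100.0, 100.0]])    # positions, mm
offsets = np.array([[3.1, -1.9], [2.8, 2.2],
                    [-3.0, 2.0], [-2.9, -2.1]])        # offsets, nm
print(fit_correctables(marks, offsets))  # terms fed forward to the stepper
```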

“The 193 immersion technology continues to make steady progress, but there are not as many new game-changing developments,” confided Michael Lercel, Director of Strategic Marketing for ASML, in an exclusive interview with SemiMD. “A major theme of several SPIE papers is EPE, which traditionally we looked at as dependent upon CD and overlay. Now we’re looking at EPE in patterning more holistically, with the need to control the complexity of different error variables. The more information we can get, the more we can control.”
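Lercel’s point that EPE was traditionally treated as a function of CD and overlay can be made concrete with a simple budget calculation. One common simplification adds the contributors in quadrature; the sketch below uses that root-sum-square form with made-up numbers, and is not ASML’s actual model.

```python
import math

# Simplified edge-placement-error budget (an assumed RSS form, for illustration):
# an edge moves by half of any CD change, plus overlay and local roughness terms.
overlay_3s = 2.5   # nm, 3-sigma overlay between the two patterned layers (assumed)
cdu_3s     = 1.8   # nm, 3-sigma CD uniformity (assumed)
ler_3s     = 1.5   # nm, 3-sigma local line-edge roughness (assumed)

epe_3s = math.sqrt(overlay_3s**2 + (cdu_3s / 2)**2 + ler_3s**2)
print(f"3-sigma EPE budget ~ {epe_3s:.2f} nm")  # ~3.05 nm for these inputs
```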

At LithoVision this year, John Sturtevant—SPIE Fellow and director of RET product development in the Design to Silicon Division at Mentor Graphics—discussed the challenges of controlling variability in multi-layer patterning. “A key challenge is predicting and then mitigating total EPE,” said Sturtevant. “We’ve always paid attention to it, but the budgets that are available today are smaller than ever. Edge-placement is very important.” At the leading edge, there are multiple steps within the basic litho flow that induce proximity/local-neighbor effects which must be accounted for in EDA: mask making, photoresist exposure, post-exposure bake (PEB), pattern development, and CD-SEM inspection (wherein there is non-zero resist shrinkage).

Due to the inherent physics of EUV lithography, as well as atomic-scale non-uniformities in the reflective mirrors that focus light onto the wafer, EUV exposure tools show significant variation in exposure uniformity. “For any given slit position there can be significant differences between tools. In practice we have used a single OPC model for all slit locations in all scanners in the fab, and that paradigm may have to change,” said Sturtevant. “It’s possible that because the variation across the scanner is as much as the variation across the slit, we’ll need scanner-specific cross-slit computational lithography.” More than 3nm of variation has been seen across 4 EUVL steppers, and the possible need for tool-specific optical proximity correction (OPC) and source-mask optimization (SMO) would be horrible for managing masks in HVM.
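The mask-management concern is easy to quantify: if correction must become tool- and slit-specific, the number of OPC models a fab maintains multiplies accordingly. A back-of-envelope sketch, where only the four-scanner figure comes from the article and the other counts are assumptions:

```python
# Back-of-envelope count of OPC models if correction becomes tool- and
# slit-specific. Only the scanner count comes from the article; the rest
# are assumptions for illustration.
scanners = 4          # EUVL steppers showing >3nm cross-tool variation
slit_positions = 13   # assumed number of separately modeled slit locations
critical_layers = 10  # assumed number of EUV-patterned critical layers

today = critical_layers                                  # one model per layer
tool_and_slit_specific = scanners * slit_positions * critical_layers
print(f"{today} OPC models today vs {tool_and_slit_specific} if tool- and "
      f"slit-specific")  # 10 vs 520: the HVM mask-management headache
```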

Thin Films Extend Patterning Resolution

Applied Materials has led the industry in thin-film deposition and etch for decades, and the company’s production-proven processing platforms are increasingly being used to extend the resolution of lithography. For SADP and SAQP multi-patterning, there are tunable unit processes established for sidewall-spacer depositions, and chemical downstream etching chambers for mandrel pull with extreme material selectivity. CVD of dielectric and metallic hard-masks, combined with highly anisotropic plasma etching, allows for device-specific and mask-specific pattern transfers that can reduce the line-width/edge roughness (LWR/LER) originally present in the photoresist. Figure 2, from the SPIE-AL presentation “Impact of Materials Engineering on Edge Placement Error” by Regina Freed, Ying Zhang, and Uday Mitra of Applied Materials, shows that LER can be reduced from 3.4 nm to 1.3 nm after etch. The company’s Sym3 chamber features very high gas conductance to prevent etch byproducts from dissociating and re-depositing on resist sidewalls.

Fig.2: 3D schematics (top) and plan view SEM images (bottom) showing that control of plasma parameters can tune the byproducts of etch processes to significantly reduce the line-width roughness (LWR) of minimally scaled lines. (Source: Applied Materials)

TEL’s new SAQP spacer-on-spacer process builds on the work shown last year, which used oxide as the first spacer and TiO2 as the second spacer. Now TEL is exploring silicon as the mandrel, silicon nitride as the first spacer, and titanium oxide as the second spacer. This new flow can be tuned so that an all-dry etch in a single plasma etch chamber can be used for the final mandrel pull and pattern-transfer steps.
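The pitch arithmetic behind such a flow is simple: each spacer step doubles the line density, so the final pitch is one quarter of the mandrel pitch, and the final line CD is set by the width of the second spacer. A minimal sketch with assumed dimensions, not TEL’s actual numbers:

```python
# Pitch-division arithmetic for a spacer-on-spacer SAQP flow (assumed numbers).
mandrel_pitch = 128.0   # nm, pitch of the patterned silicon mandrels
spacer1_width = 32.0    # nm, first spacer (silicon nitride), interim lines
spacer2_width = 16.0    # nm, second spacer (titanium oxide), final line CD

pitch_after_first_spacer = mandrel_pitch / 2   # SADP result
final_pitch = mandrel_pitch / 4                # SAQP result
print(f"{mandrel_pitch:.0f} nm mandrels -> {pitch_after_first_spacer:.0f} nm "
      f"pitch after spacer 1 ({spacer1_width:.0f} nm lines) -> "
      f"{final_pitch:.0f} nm final pitch (~{spacer2_width:.0f} nm lines)")
```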

Coventor’s 3D modeling software lets companies run process-integration experiments in virtual space, estimating yield losses in pattern transfer due to variations in sidewall profiles and LER. A simulation of 9 SRAM cells comprising 54 transistors showed that the photoresist sidewall taper angle determines both the size and the variability of the final fins. The final capacitance of low-k dielectric in dual-damascene copper interconnects can likewise be simulated as a function of the initial photoresist profile in a SAQP flow.
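At its core, such a virtual experiment samples the incoming resist profile and propagates it through a pattern-transfer model to a distribution of final dimensions. The toy Monte Carlo below is only a stand-in for that workflow; the linear sensitivity of fin CD to sidewall taper angle is a hypothetical model, not Coventor’s, and every number is assumed.

```python
import random
import statistics

# Toy Monte Carlo stand-in for a virtual process-integration experiment:
# sample resist sidewall taper, propagate to final fin CD through an assumed
# linear sensitivity. Hypothetical model and numbers, not Coventor's.
random.seed(0)

nominal_taper_deg = 88.0   # near-vertical resist sidewall (assumed)
taper_sigma_deg   = 0.5    # run-to-run taper variation (assumed)
nominal_fin_cd_nm = 7.0    # target fin width (assumed)
cd_per_degree     = 0.8    # assumed sensitivity: nm of fin CD per degree

fin_cds = []
for _ in range(54):  # e.g. the 54 transistors in the 9-cell SRAM simulation
    taper = random.gauss(nominal_taper_deg, taper_sigma_deg)
    fin_cds.append(nominal_fin_cd_nm + cd_per_degree * (taper - nominal_taper_deg))

print(f"mean fin CD {statistics.mean(fin_cds):.2f} nm, "
      f"sigma {statistics.stdev(fin_cds):.2f} nm")
```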

—E.K.

Controlling Variabilities When Integrating IC Fab Materials

Friday, April 15th, 2016


By Ed Korczynski, Senior Technical Editor, SemiMD/Solid State Technology

Semiconductor integrated circuit (IC) manufacturing has always relied upon the supply of critical materials from a global supply chain. Now that shrinks of IC feature sizes have begun to reach economic limits, future functionality improvements in ICs are increasingly derived from the use of new materials. The Critical Materials Conference 2016—to be held May 5-6 in Hillsboro, Oregon (cmcfabs.org)—will explore best practices in the integration of novel materials into manufacturing. Dr. David Thompson, Senior Director of the Center of Excellence in Chemistry at Applied Materials, will present on “Agony in New Material Introductions – minimizing and correlating variabilities,” which he was willing to discuss in advance with SemiMD.

Korczynski: With more and more materials being considered for use in high-volume manufacturing (HVM) of advanced ICs, how do you begin to screen out materials that will not work, so as to reach the best new material for a target application?

Thompson: While there’s no ‘one size fits all’ solution to this, it typically starts with a review of what’s available and known about the current offerings. With respect to the talk at the CMC, we’ll review the challenges we run into after the materials system and chemistries are set and have been proven generally viable, but still require significant optimization to get acceptable yields for manufacturing. It’s a very long road from device proof-of-concept on a new materials system to a viable manufacturing process.

Korczynski: Since new materials are being considered for use on the atomic-scale in advanced devices, doesn’t all of this have to be done with control at the atomic scale?

Thompson: For the material on the chip, many mainstream analytical techniques are used to achieve atomic-level control, including TEMs and AFMs with atomic resolution during film development for many applications. Unfortunately, this resolution is not available for the chemicals we’re relying on to deposit these materials. For a typical precursor that weighs in the 200 Dalton range, a gram of precursor may have 5 × 10²⁰ molecules. That’s a lot of molecules. Even with ppb (1 part in 10⁹) resolution in analytics, you’re still dealing with invisible populations of >10¹⁰ molecules. It gets worse. While trace-metals analysis can hit ppb levels, molecular analysis techniques are typically limited to 0.1 to 0.01 percent resolution for most semiconductor precursors, and there may be impurities that are invisible to routine analytical techniques.

Ultimately, we rely on analytical techniques to control the gross parameters and disciplined process controls to verify suppliers produce the same compositions the same way, and to manage impurities. On the process and hardware side, it’s like threading the needle trying to get the right film at the right throughput, in a process space that’s as tolerant as possible to the inevitable variability in these chemistries.
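The arithmetic behind those populations is worth spelling out: for a 200 Dalton precursor, the count works out to roughly 3 × 10²¹ molecules per gram, in the ballpark of the round figure quoted above, and a part-per-billion detection limit then leaves an enormous absolute number of molecules unaccounted for. A quick check of the numbers:

```python
# Order-of-magnitude arithmetic behind the "invisible populations" point.
AVOGADRO = 6.022e23    # molecules per mole
molar_mass = 200.0     # g/mol, typical precursor weight per the interview

molecules_per_gram = AVOGADRO / molar_mass      # ~3.0e21 for a 200 Da molecule
undetected_at_ppb = molecules_per_gram * 1e-9   # sitting below a 1-ppb limit
print(f"{molecules_per_gram:.1e} molecules/g; up to {undetected_at_ppb:.1e} "
      f"molecules invisible at ppb resolution")  # ~3.0e21 and ~3.0e12
```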

Korczynski: With all of this investment in developing one specialty material supplier for advanced IC manufacturing, what is the cost to develop and qualify a second source?

Thompson: Generally, it’s not sustainable to release a product with dual specialty-material sources. The problem with dual-sourcing is that chemical suppliers protect not just their IP but also their sub-supply chains and proprietary methods of production, transport, and delivery. However, given how trace elements in a formulation can change depending on the conditions the molecules experience over time, the customer in many cases needs to develop two separate sub-recipes based on the specific vendor’s chemistry. So redundancy in the supply chain is prudent, as is making sure the vendor can produce the material in different locations.

There are countless examples over the last 20 years of what I like to call ‘the agony of the supply chain,’ when a process got locked into using a material whose only supply was a Ph.D. chemist making it in small batches in a lab. In most cases the initial batch of any new molecule is made at a scale that would fit in a coffee mug. Scaling up to the first industrial-scale batch, though, can alter impurity factors and change yields on the wafer, even with improved purification. So while a customer would like to keep using small-batch production, it’s not sustainable; yet qualifying a second vendor in this environment presents significant challenges.

Korczynski: Can you share an example with us of how your team brought a source of subtle variation under control?

Thompson: We had a process using a new metal film, and in early development everything looked great. Eventually we observed a drift in process results that was more pronounced with some ampoules and less so with others. The root cause initially eluded us. Then a bright Ph.D. on our team noted that the supplier had not reported a particular contaminant that would tend to be present as a byproduct of the reaction. The supplier confirmed it was present, and variable at 100-300 ppm in the blend. This contaminant was more volatile than the main component due to vapor-pressure differences, and much more reactive with the substrate/wafer. This variability in the chemistry was found to induce the process variation on the wafer (as shown in Figure 1).

Figure 1: Resolution of sequential wafer drift via impurity management.

Chasing impurities and understanding their impact requires rigor and a lot of data collection. There’s no Star Trek analyzer we can use to give us knowledge of all impurities present and the role of those impurities on the process. Many impurities are invisible to routine analytical techniques, so we work very closely with vendors to establish a chemistry analytical protocol for each precursor that may consist of 5-10 different techniques. For the impurities we can’t detect we rely on excellent manufacturing process control and sub-supply sourcing management.

Korczynski: Is the supply-chain for advanced precursors for deposition and etch supplying everything we need in early R&D?

Thompson: New precursor ideation—the science that leads to new classes of compounds with new reactivity, of the kind Roy Gordon and, more recently, Chuck Winter have been doing in academia—is critically important. While there are a few academics doing excellent work in this space, in general there’s not enough focus on this topic. We see many IP-protected molecules, but too often they are modifications obvious to one skilled in the art, consisting of merely adding a functional group off of a ring or mixing and matching known ligand systems. The industry is hunting for differentiated reactivity, and this kind of evolutionary precursor development, while useful for tuning a vapor pressure or thermal stability, only very rarely produces it. We don’t see a lot of disruptive chemistries.

Korczynski: Do we need new methodologies to more efficiently manage all of this?

Thompson: Applied has made significant investments over the last 5 years to help accelerate the readiness of new materials across the board. One of the best things about working at Applied is the rate at which we can learn and build an ecosystem around a new material. With our strength in chemistry, deposition, CMP, etch, metrology and a host of other technologies, we get a fast, strong feedback loop going to accelerate issue discovery, resolution and general learning around new materials.

On the chemical supply-chain front, the need is to make sure that chemical vendors accelerate their analytical-chemistry development for new materials. Correlating the variability of chemistry to process results, and ultimately yield, is the real battle. The more knowledge we have of a chemistry moving into development, the faster learning can occur. I explain to my team that we can’t be proactive about things we didn’t anticipate: having to develop the analytical technique to see the impurity responsible for causing (or resolving) a variability after the fact means starting out at a significant disadvantage. That said, we’ve seen a good response from suppliers on new materials and significant improvement in the early learnings necessary to minimize the agony of new material introductions.

A Call To Action: How 20nm Will Change IC Design

Thursday, February 21st, 2013

The 20nm process node represents a turning point for the electronics industry. While it brings tremendous power, performance and area advantages, it also comes with new challenges in such areas as lithography, variability, and complexity. The good news is that these become manageable challenges with 20nm-aware EDA tools when they are used within end-to-end, integrated design flows based on a “prevent, analyze, and optimize” methodology.

To download this white paper, click here.