By Mark LaPedus, SemiMD senior editor
At a recent event, Ivo Bolsens, senior vice president and chief technology officer at Xilinx Inc., warned that the IC industry faces many challenges to make the giant leap into the 2.5D/3D chip era.
Design issues, technical hurdles and supply chain complexities are among the challenges, Bolsens said. “The industry still has a lot of work to do,” he said at the recent 3-D Architectures for Semiconductor Integration and Packaging conference in Burlingame, Calif.
Perhaps the most overlooked challenge is test. In fact, the industry is only now coming to grips with the challenges of testing the basic component in 2.5D/3D designs: bare die, or known good die (KGD). A related issue now rearing its ugly head is the ability to test the through-silicon vias (TSVs) in those designs.
The question is whether current test technologies are ready to ensure KGD – and known-good TSVs – for the 2.5D/3D era. And can the industry keep test costs flat? The answer: yes and no.
Besides the technical and economic issues, Xilinx and others are begging for 3D test standards. “We have to have a proper way of testing these things,” said Jae Cho, vice president of new product introduction engineering at Xilinx.
Many are asking themselves a simple question on the 3D test standards front: What is the industry waiting for?
3D devices make use of stacked bare die. In 3D designs, there is a much greater requirement to find failures in bare die to ensure KGD. A defective die will cause a failure in the entire 3D device, meaning the part must be discarded. This scenario in part also holds true for known-good TSVs.
KGD is critical to the economics. Suppose, for example, that an individual die has a yield of 90 percent. If four such die are stacked, the part will have a total yield of only about 66 percent, according to Cadence and the Global Semiconductor Association (GSA).
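The arithmetic behind that figure is simple: because one bad die scraps the whole stack, the individual die yields multiply. A quick sketch:

```python
# Stacked-die yield: a single defective die scraps the entire stack,
# so the yields of the individual dies multiply together.
def stack_yield(die_yield: float, num_dies: int) -> float:
    return die_yield ** num_dies

# 90 percent die yield in a four-die stack gives roughly 66 percent
# part yield, matching the Cadence/GSA figure cited above.
print(round(stack_yield(0.90, 4) * 100, 1))  # 65.6
```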
At the recent 3-D event, Joseph Sawicki, vice president and general manager of the Design-to-Silicon Division at Mentor Graphics Corp., listed several daunting test steps required to enable 2.5D/3D chips: KGD testing, known good interposer and pre-bond TSV testing, partial stack testing, memory-to-logic testing, and logic-to-logic TSV testing. He also outlined some of the solutions to enable KGD, including comprehensive wafer test coverage, the implementation of new and existing design-for-test (DFT) techniques, and, of course, standards.
Needless to say, chip makers face enormous challenges. One of the first 2.5D devices to ship is Xilinx’ Virtex-7 2000T FPGA, a product based on a 28nm process and a silicon interposer technology. The 2000T is a homogeneous part, in which four identical FPGA slices are placed on a single 65nm silicon interposer. The device itself is built and assembled by TSMC.
For the 2000T, Xilinx conducts more testing at wafer sort than final test, Cho said. Xilinx does not test the actual TSVs. The interposer, equipped with the TSVs, is based on a mature 65nm process and “comes with a high yield to begin with,” he said. Because Xilinx conducts more testing at wafer sort with high test coverage — and less at final test — “we can keep the cost of test relatively flat,” he said.
The real challenge for Xilinx is the advent of heterogeneous 2.5D chips like the Virtex-7 HT. This device combines Xilinx’ 2000T with a 28.05-Gbit/second transceiver chip from an undisclosed third party. Xilinx obtains a separate KGD from the third-party transceiver vendor.
In general, the root of the KGD problem is that a third party may be reluctant to share its IP and test data. “The problem in dealing with a third party is that we don’t know what the internal test coverage is,” Cho said. “The quality of die is high enough, but it is still a major issue.”
To ensure that Xilinx obtains KGD from a third party, the company works closely with its partner in the early stages of development to ensure “testability is included in the product,” he said.
But as the overall industry develops and ships more heterogeneous 2.5D/3D devices into the market, some major problems loom. If a chip fails in the field, it’s still unclear who will take responsibility for the returns or who will conduct the failure analysis. One way to mitigate the problem is clear. “If there was a standard way to implement testability (in KGD), that would help,” he said.
DFT to the rescue?
Clearly, the DFT puzzle starts at design. For years, chip makers have used several DFT techniques — such as built-in-self-test (BIST), scan and others — to reduce test costs. BIST will be required in 3D memory and non-memory architectures.
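To make the BIST idea concrete, here is a hedged, toy-scale sketch: a linear-feedback shift register (LFSR) generates pseudo-random test patterns on-chip, and a simple shift-and-XOR compactor folds the circuit’s responses into a signature that is compared against a known-good value. The circuits, tap positions and widths below are invented for illustration and do not represent any vendor’s implementation.

```python
# Toy logic-BIST sketch (illustrative only): an on-chip LFSR generates
# pseudo-random patterns, and responses are compacted into a signature
# that is compared against a known-good ("golden") value.
def lfsr(seed, taps, nbits, steps):
    """Fibonacci LFSR: yield `steps` successive nbits-wide states."""
    state = seed
    for _ in range(steps):
        yield state
        feedback = 0
        for t in taps:
            feedback ^= (state >> t) & 1
        state = ((state << 1) | feedback) & ((1 << nbits) - 1)

def signature(circuit, patterns, nbits=8):
    """Shift-and-XOR response compactor (a crude stand-in for a MISR)."""
    sig = 0
    for p in patterns:
        sig = ((sig << 1) & ((1 << nbits) - 1)) ^ circuit(p)
    return sig

patterns = lambda: lfsr(0b0001, taps=(3, 0), nbits=4, steps=15)
golden = signature(lambda x: x ^ 0b1010, patterns())  # fault-free circuit
faulty = signature(lambda x: x & 0b1010, patterns())  # defective circuit
print(golden != faulty)  # a mismatching signature flags the defect
```

In real silicon the compactor would be a multiple-input signature register and the pass/fail comparison would happen on-chip, which is exactly what makes BIST attractive for die buried inside a stack.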
For its 2.5D FPGA design, Xilinx uses what it calls a “broadcast mode” method for testing the 2000T, but the company did not elaborate. This, however, implies that Xilinx utilizes a mature DFT technique called boundary scan test. In boundary scan, the chip incorporates a series of internal latch cells, which are interconnected to form an independent scan path, or shift register. Through an industry-defined test access port and control function, the scan path enables embedded test within the device.
In broadcast mode, the device is tested by broadcasting the same input data to multiple scan chains. To reduce data volumes and test costs, this mode also makes use of an automatic test pattern generation (ATPG) compression technology.
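Conceptually, broadcast mode can be sketched as follows. This is only an illustration of the general technique, not Xilinx’s actual implementation, which the company has not detailed; the function names and patterns are invented.

```python
# Illustrative sketch of broadcast-mode scan test: one stimulus pattern
# is shifted into several scan chains at once, so tester data volume
# stays flat as the number of chains grows.
def broadcast_scan(pattern, num_chains):
    """Drive the same scan-in pattern into every chain."""
    return [list(pattern) for _ in range(num_chains)]

def compare_responses(responses, expected):
    """Return the indices of chains whose captured response mismatches."""
    return [i for i, resp in enumerate(responses) if resp != expected]

chains = broadcast_scan([1, 0, 1, 1], num_chains=4)
chains[2][0] = 0  # pretend chain 2 captured a faulty bit
print(compare_responses(chains, [1, 0, 1, 1]))  # [2]
```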
Known as the IEEE 1149.1 standard, boundary scan is one of the keys in the 2.5D/3D test puzzle. Boundary scan is typically used to test the bottom die in a stacked design, according to one DFT strategy envisioned by Mentor Graphics. Two other standards, IEEE 1500 and IEEE 1687, can be used to test the middle die, according to Mentor. IEEE 1500 enables test reuse and integration for embedded cores. In simple terms, IEEE 1687 describes how to connect the on-chip instruments in a design.
According to Mentor, the three standards can be used in a “mix and match” format to test a 2.5D/3D device. Now, the real trick is the ability to develop a standard to support heterogeneous die from multiple vendors.
To overcome this problem, the so-called 3D Test Working Group is hammering out a proposed standard called IEEE 1838. The proposed standard aims to define the architecture and description language for the “test access” architecture within a 3D device. Currently, the group is looking at two test access technologies: IEEE 1149.1 and IEEE 1500. It is unlikely that the group will endorse both technologies. (See below for proposed architectures.)
But what about KGD? KGD will require comprehensive wafer test coverage, said Steve Pateras, product marketing director for Silicon Test Systems at Mentor Graphics. As part of that coverage, the development of 3D devices will “make use of more advanced fault models,” Pateras said.
ATPG tools generate test patterns based on multiple fault models, which emulate manufacturing defects. Common fault models include stuck-at, bridging and transition. EDA vendors typically implement these models for a customer, but the process can take time. Mentor and others have developed user-defined fault models (UDFM), such as cell-aware. UDFM enables designers to create fault models without waiting for tool support.
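To make the stuck-at model concrete, here is a minimal sketch on an invented two-gate netlist. A pattern “detects” a fault when the faulty circuit’s output differs from the good circuit’s output; the netlist and node names are hypothetical, chosen purely for illustration.

```python
# Stuck-at fault model on a toy netlist: y = (a AND b) OR c.
def good_circuit(a, b, c):
    return (a & b) | c

def faulty_circuit(a, b, c, stuck_node, stuck_val):
    # Force one internal node to a constant (stuck-at) value.
    n_and = stuck_val if stuck_node == "and" else (a & b)
    return n_and | c

def detects(pattern, stuck_node, stuck_val):
    """A pattern detects a fault if good and faulty outputs differ."""
    a, b, c = pattern
    return good_circuit(a, b, c) != faulty_circuit(a, b, c, stuck_node, stuck_val)

# (a=1, b=1, c=0) exposes the AND node stuck-at-0: good output 1, faulty 0.
print(detects((1, 1, 0), "and", 0))  # True
# (a=0, b=0, c=1) masks the same fault: both outputs are 1.
print(detects((0, 0, 1), "and", 0))  # False
```

An ATPG tool does this search automatically across every modeled fault in a design, which is why richer fault models (bridging, transition, cell-aware) directly drive pattern count and test time.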
Sorting out the problem
Once the chip has been designed, it moves into the chip-packaging and test phase. The cost of test accounts for about 15 percent of the overall assembly cost, said Ram Praturu, senior director of test product and technology marketing at STATS ChipPAC Ltd., a chip-packaging house. “Now, with the complexity (of new chip designs), that could become higher,” Praturu said.
In the engineering phase alone, it can take from 6 months to a year to set up the test flow and ensure the yield for a 2.5D/3D device, he said. In that phase, chip makers may put a device through multiple test steps — such as wafer sort, one or more (and expensive) partial assembly test steps and a final/systems-level test — to ensure a die meets spec.
In the production stage, the final goal is to reduce the test flow down to two steps — wafer probe and final test — to keep costs down. In a typical flow, a wafer is processed in a fab. Then, the wafer is placed on a prober for wafer-level test. The prober is fitted with a custom probe card, which itself has thousands of probe needles that contact the bond pads of each bare die on the wafer. Using electrical stimuli, the prober detects defective die, which are then discarded.
“The more stringent test is done at final test (for today’s 2D designs). Wafer sort was viewed as an initial screening process. That breaks down in 3D. There is a need to migrate high quality test from final test to wafer sort,” Mentor’s Pateras said.
With KGD in 2.5D/3D designs, some 90 percent to 95 percent — or even 99 percent — will be tested at wafer sort, STATS’ Praturu said. “Is KGD (testing) robust enough today? It’s evolving based on the application,” Praturu said. But the real challenge is that “everyone is putting more I/Os on the same die. As applications become more complex, it takes more or longer time” to test a given bare die or part.
Mike Slessor, president and chief executive of MicroProbe Inc., a probe card maker, said: “Having 100 percent test coverage is not a routine matter. KGD would be nice to have, but what are the related risks and costs? Cost will be the determining factor in wafer probe and one’s test strategy.”
Another challenge for the industry is TSV testing, sometimes called pre-bond TSV test. Testing the microbumps in 3D designs poses a similar problem. In one example, the original wafer could have a thickness of 700 microns. To expose the TSVs, the wafer must be thinned to 50 microns or less. Thinning the wafer can sometimes damage the device.
TSV and microbump pitches are generally too tiny for today’s probe cards. Several solutions have been proposed for microbump and TSV pre-bond testing: scan, new probe card technologies, contactless wafer probe and others.
IMEC and Cascade Microtech are working on “rocking beam interposer” (RBI) probe technology, which is said to handle around 35 micron pitches. Longer term, the industry is looking at probe cards based on smaller, lithography-defined tip structures, said MicroProbe’s Slessor.
In another possible solution, STMicroelectronics last year announced a so-called contactless wafer probe technology. Contactless testing reduces cycle time because it enables a high degree of test parallelism. It also increases yield by eliminating the pad damage that occasionally occurs during standard contact probe testing.
The technology is called electromagnetic wafer sort (EMWS). In EMWS, the individual die contain a tiny antenna. The prober supplies power and communicates with the die via electromagnetic waves. The first application appears to be contactless wafer probe of radio-frequency ID chips. The technology could expand into other fronts like TSVs.
Despite the new technologies on the horizon, Slessor said it could be a moot point if the industry cannot address a key issue in 3D chip testing. “Cost is going to become very important,” he said. The success or failure in the 2.5D/3D chip era “depends primarily on the economics of testing versus yields versus failed die.”
Scan proposal for 1838 standard (Source: IMEC)
Embedded core option for 1838 (Source: IMEC)