
Posts Tagged ‘ARM’

Mentor Graphics Signs Agreement with ARM to Accelerate Early Hardware/Software Development

Wednesday, November 16th, 2016

Mentor Graphics Corporation (NASDAQ: MENT) has signed a multiyear license agreement with ARM to gain early access to a broad range of ARM Fast Models, Cycle Models and related technologies. Mentor will have access to all ARM Fast Models for the ARMv7 and ARMv8 architectures across all ARM Cortex-A, Cortex-R and Cortex-M cores, GPUs and System IP, in addition to engineering collaboration on further optimizations. This builds on agreements already in place to ensure that the validation of ARM models is completed ahead of mutual customer demand.

“Our collaboration with Mentor has resulted in one of ARM’s broadest modeling partnerships,” said Javier Orensanz, general manager, development solutions group, ARM. “With this agreement, our mutual customers can utilize ARM’s entire model portfolio to speed system execution and debug issues with complete accuracy.”

As a result of this agreement, ARM Fast Models can be combined with the Veloce emulation platform, for example, to enable faster verification and earlier software development. Moving the modeling of the CPU and GPU out of the emulator and into the ARM Fast Models allows software to execute orders of magnitude faster than a traditional approach that relies on a complete RTL description being ready. Software tasks such as Android boots and application runs can therefore be executed quickly, and verification teams can validate more than just boot code and drivers. They can also run complete software stacks to exercise the system in a realistic manner and flush out hard-to-find bugs that would otherwise go undetected until physical prototypes were available.
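To see why this matters in practice, here is a back-of-the-envelope sketch of the speedup from running the processor model outside the emulator. All instruction counts and throughput figures below are illustrative assumptions, not Veloce or ARM Fast Model specifications:

```python
# Rough comparison of software bring-up time with the CPU modeled in RTL
# (inside the emulator) versus in an instruction-accurate fast model.
# Every number here is an assumption chosen only for illustration.

boot_instructions = 50e9   # assumed instructions executed during a full OS boot

rtl_emulation_ips = 2e6    # assumed effective instructions/sec with the CPU in RTL
fast_model_ips = 200e6     # assumed instructions/sec for an instruction-accurate model

hours_rtl = boot_instructions / rtl_emulation_ips / 3600
minutes_fast = boot_instructions / fast_model_ips / 60

print(f"CPU modeled in RTL:  {hours_rtl:.1f} hours per boot")
print(f"CPU in a fast model: {minutes_fast:.1f} minutes per boot")
# Roughly two orders of magnitude, which is what makes OS boots and
# application runs practical targets for verification.
```

The exact ratio depends entirely on the assumed figures; the point is only that an instruction-accurate model shifts full-stack workloads from impractical to routine.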

“This second agreement with ARM clearly indicates our strategic alignment toward providing a complete HW/SW development platform,” said Brian Derrick, vice president of marketing, Mentor Graphics. “Our mutual customers benefit from early access and validation of state-of-the-art Mentor technology working with the most current ARM models.”

Mentor Graphics U2U Meeting April 26 in Santa Clara

Monday, April 11th, 2016


Mentor Graphics’ User2User meeting will be held in Santa Clara on April 26, 2016. The meeting is a highly interactive, in-depth technical conference focused on real world experiences using Mentor tools to design leading-edge products.

Admission and parking for User2User is free and includes all technical sessions, lunch and a networking reception at the end of the day. Interested parties can register on-line in advance.

Wally Rhines, Chairman and CEO of Mentor Graphics, will kick things off at 9:00am with a keynote talk on “Merger Mania.” Wally notes that in 2015, the transaction value of semiconductor mergers was at an all-time high. What is much more remarkable, he said, is that the average size of the merging companies is five times as large as in the previous five years. This major change in the structure of the semiconductor industry suggests that there will be changes that affect everything from how we define and design products to how efficiently we develop and manufacture them. Dr. Rhines will examine the data and provide conclusions and predictions.

He will be followed by another keynote talk at 10:00 by Zach Shelby, VP of Marketing for the Internet of Things at ARM. Zach was co-founder of Sensinode, where he was CEO, CTO and Chief Nerd for the ground-breaking company before its acquisition by ARM. Before starting Sensinode, Zach led wireless networking research at the Centre for Wireless Communications and at the Technical Research Center of Finland.

After user sessions and lunch, a panel will convene at 1:00pm to address the topic “Ripple or Tidal Wave: What’s driving the next wave of innovation and semiconductor growth?” Technology innovation was once fueled by the personal computer, communications, and mobile devices. Large capital investment and startup funding were rewarded with market growth and increased silicon shipments. The semiconductor market is certainly consolidating, and perhaps slowing down, so what’s going to drive the next wave of growth? What types of designs will be staffed and funded? Is it IoT? Wearables? Automotive? Experts will address these and other questions and examine what is driving growth and what innovation is yet to come.

Attendees can pick from nine technical tracks focused on AMS Verification, Calibre I and II, Emulation, Functional Verification, High Speed, IC Digital Implementation, PCB Flow, and Silicon Test & Yield Solutions. You’ll hear case studies directly from users as well as updates from Mentor Graphics experts.

These user sessions will be held at 11:10am-12:00pm, 2:00-2:50pm and 3:10-5:00pm.

A few of the highlights:

  • Oracle’s use of advanced fill techniques for improving manufacturing yield
  • How Xilinx built a custom ESD verification methodology on the Calibre platform
  • Qualcomm used emulation for better RTL design exploration for power, leading to more accurate power analysis and sign-off at the gate level
  • Micron’s experience with emulation, a full environment for debug of SSD controller designs, plus future plans for emulation
  • Microsoft’s use of portable stimulus to increase productivity, automate the creation of high-quality stimulus, and increase design quality
  • Formal verification at MicroSemi to create a rigorous, pre-code check-in review process that prevents bugs from infecting the master RTL
  • A methodology for modeling and simulation of highly integrated multi-die package designs at SanDisk
  • How Samsung and nVidia use new Automatic RTL Floorplanning capabilities on their advanced SoC designs
  • Structural test at AMD: traditional ATPG and Cell-Aware ATPG flows, as well as verification flows and enhancements

Other users presenting include experts from TowerJazz, Broadcom, GLOBALFOUNDRIES, Silicon Creations, MaxLinear, Silicon Labs, Marvell, HiSilicon, Qualcomm, Soft Machines, Agilent, Samtec, Honeywell, STMicroelectronics, SHLC, ViaSat, Optimum, NXP, ON Semiconductor and MCD.

The day winds up with a closing session and networking reception from 5:00-6:00pm.

Registration is from 8:00-9:00am.

Securing Medical Devices Is Topic of Conference Keynote

Thursday, December 3rd, 2015


By Jeff Dorsch, Contributing Editor

The Designers of Things conference and exposition in San Jose, Calif., kicked off Wednesday morning (December 2) with a keynote address by Jay Radcliffe of Rapid7.

He looked at medical devices in particular, and how these Internet-connected devices could be secured.

As a cybersecurity researcher, Radcliffe has procured hundreds of medical devices by various means to examine them for security vulnerabilities.

These issues are personal for him, since he has Type 1 diabetes. “Insulin pumps help people live a better life,” he said. “They help parents live a happier life.”

Headlines about online theft of credit-card information and identity theft point to the insecurity of sensitive information, Radcliffe noted. “We can’t secure a simple five-dollar transaction,” he commented. The vulnerability of medical devices is more critical. “You could lose someone’s life,” he said.

“More malware is coming out on Android devices,” Radcliffe said. At the same time, some companies are turning to the Android mobile operating system as “the foundation for medical devices saving people’s lives,” he added.

Development of medical devices for the Internet of Things worries the veteran security researcher. “My fear – we’re going too fast,” Radcliffe said. “How do we put Bluetooth in our medical devices safely?”

Too often, people in positions of authority say cybersecurity is “too complicated,” Radcliffe said. Not enough cybersecurity specialists are available, they insist.

The answer? “We need a Spartan,” he said, referring to the warriors of ancient Sparta. “They defended.”

Radcliffe added, “You need a security researcher, a specialist.” Such personnel are available as consultants, he noted. There are ISO standards on vulnerability management.

Consultants and security partners should be aware of “all aspects of your business,” he advised.

Radcliffe concluded, “Keep security in the loop for the lifecycle of your device.”

The Rapid7 consultant was followed by Ryan Cousins, chief executive officer of krtkl, creator of the Snickerdoodle development board for drones, robots, and other applications. His theme was, “Why the IoT is broken, and how to fix it.”

The present state of IoT has “too much ‘I’” and “not enough ‘T’,” he asserted. Software is dynamic, while hardware is typically static, he said.

“What happens when they’re one and the same?” Cousins asked. Through the use of programmable logic, hardware can be reconfigured through software updates, he said.

The future? ARM-based field-programmable gate array system-on-a-chip devices, according to Cousins.

With such technology, hardware can be changed in the field, product lifecycles are longer thanks to design reuse, and time-to-market is quicker, resulting in “engaged and happy customers” and “expanded revenue streams,” he said.

Once a product is shipped, reconfigurable hardware can integrate machine vision, accelerate the use of complex algorithms, modify and add capabilities, and increase intelligence, Cousins added.

“Now THAT is an IoT,” he concluded.

IoT Security, Software Are Highlighted at ARM TechCon

Friday, November 13th, 2015


By Jeff Dorsch, Contributing Editor

Many people are aware of the Internet of Things concept. What they want to know now is how to secure the IoT and how to develop code for it.

Plenty of vendors on hand for the ARM TechCon conference and exposition in Santa Clara, Calif. this week were offering solutions on both counts. And there were multiple presentations in the three-day conference program devoted to both subjects.

Mentor Graphics, for instance, spoke about “Use Cases for ARM TrustZone: Benefits of HW-Enforced Partitioning and OS Separation.” MediaTek presented on “Secured Communication Between Devices and Clouds with LinkIt ONE and mbedTLS.” And so on.

ARM CEO Simon Segars said in his keynote address that security and trust together form one of the key principles of the Internet of Things (the others being connectivity and partnership across the ecosystem). Security and trust, he asserted, must be “at every level baked into the hardware, before you start layering software on top.”

James Bruce, ARM’s director of mobile solutions, addressed the security topic at length in an interview at the conference. ARM is taking a holistic approach to security through its TrustZone technology, he said, describing it as “a great place to put [network] keys.”

With microcontrollers, the chips often used in IoT devices, TrustZone makes sure sensitive data is “inaccessible to normal software,” Bruce said. At the same time, “you want to make devices easy to update,” he added.

ARM wants to enable its worldwide ecosystem of partners to stay ahead of cyberattacks and other online dangers, according to Bruce. “That’s why we’re doing the groundwork now,” he said.

The reaction of ARM partners to the introduction of TrustZone CryptoCells and the new ARMv8-M architecture for embedded devices has been “very positive,” Bruce said, adding, “Security can’t be an afterthought.”

Ron Ih, senior manager of marketing and business development in the Security Products Group at Atmel, described standard encryption as “only a piece” of security measures. “Authentication is a key part,” he said.

Atmel was touting its Certified-ID platform at ARM TechCon, featuring the ATECC508A cryptographic co-processor. Ih cited the “made for iPhone” chips that Apple requires of its partners developing products to complement the smartphone, ensuring ecosystem control. “You either have the chip or you don’t,” he said.

“People don’t care about the devices,” Ih concluded. “They care about who the devices are connected to.”

Simon Davidmann, president and chief executive officer of Imperas Software, is a veteran of the electronic design automation field, and he brings his experience to bear in the area of embedded software development.

Software, especially for the IoT, is “getting so complex, you can’t do what you used to do,” he said. “The software world has to change. Nobody should build software without simulation.”

At the same time, simulation is “necessary but not sufficient” in software development, he said. Code developers should be paying attention to abstractions, assertions, verification, and other aspects, according to Davidmann.

“Our customers are starting to adopt virtual platforms,” he added.

Jean Labrosse, president and CEO of Micrium, a leading provider of real-time operating system kernels and other software components, said “the industry is changing” with the onset of the Internet of Things. Multiple-core chips are entering the mix – not only for their low-power attributes, but for the safety and security they can provide, he noted.

Jeffrey Fortin, director of product management at Wind River and a specialist in IoT platforms, spoke on the last day of the conference on “Designing for the Internet of Things: The Technology Behind the Hype.”

Wind River, now an Intel subsidiary, has been around for more than three decades, developing “an embedded operating system that could be connected to other systems,” he said.

There are two business interests driving IoT demand, according to Fortin – business optimization and business transformation. He described the IoT as “using data to feed actionable analytics.”

The foundation of the IoT is hardware and software that provides safety and security, Fortin said.

Colt McAnlis of Google (Photo by Jeff Dorsch)

In the final keynote of ARM TechCon, Google developer advocate Colt McAnlis spoke on “The Hard Things About the Internet of Things.”

IoT technology, at present, is “not optimizing the user,” he said in a frequently funny and witty presentation. Networking and battery issues are bedeviling the IoT ecosystem, he asserted.

By draining the batteries of mobile devices with near-constant signals, such as setting location via GPS, companies are imposing “a taxation system for every single thing [IoT] does,” McAnlis said. “We’re talking about how often we’re sampling. People are already realizing this sucks.”

Beacons installed in a shopping mall can bombard smartphone users with advertising and coupons, he noted, while the property management gets data on specifics of foot traffic. “Imagine this at scale,” installed on every block of San Francisco, he added.

“We have a chance to not make this a reality,” McAnlis asserted. “We need IoT technology to make this not suck for users.”

At the end of his keynote, McAnlis asked the attendees to hold up their smartphones and vow, “I solemnly agree not to screw this up.”

ARM CEO Celebrates 500 Years of Connectivity

Wednesday, November 11th, 2015

By Jeff Dorsch, Contributing Editor

“Realize that everything connects to everything else,” Leonardo da Vinci said some five centuries ago.

Simon Segars, chief executive officer of ARM Holdings, took that quotation as the theme for his ARM TechCon keynote address on Wednesday morning (November 11), which was entitled “Building Trust in a Connected World.”

ARM CEO Simon Segars

“The future is dependent on the connections we make,” Segars commented.

He reviewed the history of significant products in the 20th century – automobiles, vacuum cleaners, DVD players, et al. – and noted how their pricing was reduced through “optimizing supply chains.”

In 2015, “smartphones are essentially free,” Segars said. Pulling together all the capabilities and components that go into smartphones today would cost $3.56 million in 1990, the year ARM was established, he estimated. He displayed a RadioShack advertisement from 25 years ago with a page full of consumer electronics – all of which are now contained in smartphones.

In the 21st century, “the world has moved on,” Segars observed. Modern industry involves “planetary ecosystems,” he said, enabling worldwide contributions to developing the Internet of Things.

“Let’s take the opportunity to get IoT right,” Segars said, noting its development will depend on connectivity, based on common standards; security and trust; and partnerships across the ecosystem.

Automotive vehicles, medical electronics, and “smart cities” are key areas where the IoT will find growth prospects, the ARM CEO said.

“Cars are getting smarter,” Segars said, noting that the average vehicle contains hundreds of microcontrollers. It is estimated that 40 percent of the cars in the U.S. will have Long-Term Evolution (LTE) connectivity by 2019, he added.

As he went deeper into the topic of Internet-connected cars, a fire alarm went off in the crowded Mission City Ballroom of the Santa Clara Convention Center. Segars, the son of a fireman, directed the attendees to leave the building, interrupting the keynote address.

When the alarm proved to be false, the keynote resumed, with Segars bringing on three industry executives for a panel session. They were Paul Beckwith of the Progressive Group of Insurance Companies, Coby Sella of ARM, and Balaji Yelamanchili of Symantec.

“We talk about trust,” Sella said. “You have to analyze the risk factors.”

Yelamanchili said, “A lot of times, security is an afterthought.” For the IoT, security measures must be built into the chips and systems involved, he asserted.

To prevent data leakage, “these devices and how you connect these devices are purpose-built,” he added.

Beckwith said “our brand is at risk” if everything in the IoT is not secure.

Sella noted, “We are very much at the beginning” of IoT technology.

Segars asked the panelists what IoT will look like in five years.

“We have to do our best to make sure the security is built in,” Yelamanchili said. “There are enormous opportunities out there.”

Sella said, “We will start to see horizontal play in IoT. It depends on our ability to drive this forward.”

Beckwith commented that the industry will have to “react quicker” to security challenges and data-breach episodes.

MicroWatt Chips shown at ISSCC

Thursday, March 5th, 2015


By Ed Korczynski, Sr. Technical Editor

With much of the future demand for silicon ICs forecast to be for mobile devices that must conserve battery power, it was natural for much of the focus at the just-concluded 2015 International Solid-State Circuits Conference (ISSCC) in San Francisco to be on ultra-low-power circuits that run on mere microwatts (µW). From analog to digital logic to radio-frequency (RF) chips, and extending to complete system-on-chip (SoC) prototypes, silicon IC functionality is being designed with evolutionary and even revolutionary reductions in the operational power needed.

The figure shows a multi-standard 2.4 GHz radio that was co-developed by imec, Holst Centre, and Renesas using a 40nm CMOS process. It was detailed in session 13.2, where Y.H. Liu presented “A 3.7mW-RX 4.4mW-TX Fully Integrated Bluetooth Low-Energy/IEEE802.15.4/Proprietary SoC with an ADPLL-Based Fast Frequency Offset Compensation in 40nm CMOS.” The chip uses a digital-intensive RF architecture tightly integrated with the digital baseband (DBB) and a microcontroller (MCU); the digital-intensive RF design reduces the analog core area to 1.3mm², while the DBB/MCU/SRAM occupies 1.1mm². Compared with a previous 90nm RF front-end design, this evolution reduces supply voltage by 20 percent, power consumption by 25 percent, and chip area by 35 percent.

Ultra-low-power multi-standard 2.4 GHz radio compliant with Bluetooth Low Energy and ZigBee, co-developed by imec, Holst Centre, and Renesas. (Source: Renesas)

“From healthcare to smart buildings, ubiquitous wireless sensors connected through cellular devices are becoming widely used in everyday life,” said Harmke De Groot, Department Director at imec. “The radio consumes the majority of the power of the total system and is one of the most critical components to enable these emerging applications. Moreover, a low-cost area-efficient radio design is an important catalyst for developing small sensor applications, seamlessly integrated into the environment. Implementing an ultra-low power radio will increase the autonomy of the sensor device, increase its quality, functionality and performance and enable the reduction of the battery size, resulting in a smaller device, which in case of wearable systems, adds to user’s comfort.”

When most ICs were used in devices and systems powered by line current, there was no advantage to minimizing power consumption, so digital CMOS circuits could be designed with billions of transistors switching billions of times each second, delivering enough brute-force compute power to solve most problems. With power consumption now a vital aspect of much of the demand for future chips, this year’s ISSCC offered the following tutorials on low-power chips:

  • “Ultra Low Power Wireless Systems” by Alison Burdett of Toumaz Group (UK),
  • “Low Power Near-threshold Design” by Dennis Sylvester of University of Michigan, and
  • “Analog Techniques for Low-Power Circuits” by Vadim Ivanov of Texas Instruments.

Then on Thursday the 26th, an entire short course was offered on “Circuit Design in Advanced CMOS Technologies: How to Design with Lower Supply Voltages,” with lectures on the following:

  • “A Roadmap to Lower Supply Voltages – A System Perspective” by Jan M. Rabaey of UC Berkeley,
  • “Designing Ultra-Low-Voltage Analog and Mixed-Signal Circuits” by Peter Kinget of Columbia University,
  • “ADC Design in Scaled Technologies” by Andrea Baschirotto of University of Milan-Bicocca, and
  • “Ultra-Low-Voltage RF Circuits and Transceivers” by Hyunchoi Shin of Kwangwoon University.

µW SoC Blocks

Session 5.10 covered “A 4.7MHz 53µW Fully Differential CMOS Reference Clock Oscillator with -22dB Worst-Case PSNR for Miniaturized SoCs” by J. Lee et al. of the Institute of Microelectronics (Singapore) along with researchers from KAIST and Daegu Gyeongbuk Institute of Science and Technology in Korea. While many SoCs for the IoT are intended for machine-to-machine networks, human interaction will still be needed for many applications, so session 6.7 covered “A 2.3mW 11cm-Range Bootstrapped and Correlated-Double-Sampling (BCDS) 3D Touch Sensor for Mobile Devices” by L. Du et al. from UCLA (California).

As indicated by the low MHz speed of the clock circuit referenced above, the only way that these ICs can consume 1/1000th of the power of mainstream chips is to operate at 1/1000th the speed. Also note that most of these chips will be made using 90nm- and 65nm-node fab processes, instead of today’s leading 22nm- and 14nm-node processes, as evidenced by session 8.3, which covered “A 10.6µA/MHz at 16MHz Single-Cycle Non-Volatile Memory-Access Microcontroller with Full State Retention at 108nA in a 90nm Process” by V.K. Singhal et al. from the Kilby Labs of Texas Instruments (Bangalore, India). Session 18.3 covered “A 0.5V 54µW Ultra-Low-Power Recognition Processor with 93.5% Accuracy Geometric Vocabulary Tree and 47.5% Database Compression” by Y. Kim et al. of KAIST (Daejeon, Korea).
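The power-versus-clock tradeoff above follows from the first-order CMOS dynamic power relation P = αCV²f: at a fixed supply voltage, switching power tracks clock frequency linearly. A minimal sketch, with purely illustrative activity-factor, capacitance, and voltage values (none taken from the papers cited here):

```python
# First-order CMOS switching power: P = alpha * C * V^2 * f.
# All parameter values below are illustrative assumptions.

def dynamic_power(alpha, c_farads, v_volts, f_hertz):
    """First-order dynamic (switching) power of a CMOS block."""
    return alpha * c_farads * v_volts**2 * f_hertz

# The same hypothetical block clocked at ~1 GHz versus ~1 MHz.
p_fast = dynamic_power(alpha=0.1, c_farads=1e-9, v_volts=1.0, f_hertz=1e9)
p_slow = dynamic_power(alpha=0.1, c_farads=1e-9, v_volts=1.0, f_hertz=1e6)

print(f"fast clock: {p_fast*1e3:.1f} mW, slow clock: {p_slow*1e6:.1f} uW")
print(f"ratio: {p_fast / p_slow:.0f}x")  # ~1000x, tracking the clock ratio
```

Lowering the supply voltage (as in the sub-threshold designs discussed below) improves on this further, since power falls with the square of V.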

In the Low Power Digital sessions it was natural that ARM Cortex chips were the basis for two different presentations on ultra-low power functionality, since ARM cores power most of the world’s mobile processors, and since the RISC architecture of ARM was deliberately evolved for mobile applications. Session 8.1 covered “An 80nW Retention 11.7pJ/Cycle Active Subthreshold ARM Cortex-M0+ Subsystem in 65nm CMOS for WSN Applications” by J. Myers et al. of ARM (Cambridge, UK). In the immediately succeeding session 8.2, W. Lim et al. of the University of Michigan (Ann Arbor) presented on the possibilities for “Batteryless Sub-nW Cortex-M0+ Processor with Dynamic Leakage-Suppression Logic.”

nW Beyond Batteries

Session 5.4 covered “A 32nW Bandgap Reference Voltage Operational from 0.5V Supply for Ultra-Low Power Systems” by A. Shrivastava et al. of PsiKick (Charlottesville, VA). PsiKick’s silicon-proven ultra-low-power wireless sensing devices are based on over 10 years of development of Sub-Threshold (Sub-Vt) devices. They are claimed to operate at 1/100th to 1/1000th of the power budget of other low-power IC sensor platforms, allowing them to be powered without a battery from a variety of harvested energy sources. These SoCs include full sensor analog front-ends, programmable processing and memory, integrated power management, programmable hardware accelerators, and full RF (wireless) communication capabilities across multiple frequencies, all of which can be built with standard CMOS processes using standard EDA tools.

Extremely efficient energy harvesting was also shown by S. Stanzione et al. of Holst Centre/imec/KU Leuven working with OMRON (Kizugawa, Japan) in session 20.8 “A 500nW Battery-less Integrated Electrostatic Energy Harvester Interface Based on a DC-DC Converter with 60V Maximum Input Voltage and Operating From 1μW Available Power, Including MPPT and Cold Start.” Such energy harvesting chips will power ubiquitous “smarts” embedded into the literal fabric of our lives. Smart clothes, smart cars, and smart houses will all augment our lives in the near future.

—E.K.

Solid State Watch: Sept. 25-Oct. 3, 2014

Monday, October 6th, 2014

Qualcomm: Scaling down is not cost-economic anymore – so we are looking at true monolithic 3D

Monday, June 16th, 2014

By Zvi Or-Bach, President and CEO of MonolithIC 3D Inc.

Over the course of three major industry conferences (VLSI 2013, IEDM 2013 and DAC 2014), executives of Qualcomm voiced a call for monolithic 3D “to extend the semiconductor roadmap way beyond the 2D scaling” as part of their keynote presentations.

Karim Arabi, Qualcomm VP of Engineering, voiced the strongest support and provided many details of monolithic 3D’s role in his keynote at this year’s DAC. A good summary was posted at the Tech Design Forums site under the title “3D and EDA need to make up for Moore’s Law, says Qualcomm.” In this blog, I’ll highlight some of the very interesting quotes from Arabi’s keynote: “Qualcomm is looking to monolithic 3D and smart circuit architectures to make up for the loss of traditional 2D process scaling as wafer costs for advanced nodes continue to increase. One of the biggest problems is cost. We are very cost sensitive. Moore’s Law has been great. Now, although we are still scaling down, it’s not cost-economic anymore.”

Qualcomm is not the only fabless company voicing its concern with cost. Early in 2013, Nvidia said it was “deeply unhappy” and executives of Broadcom followed suit. A chart presented by ARM illustrated this nicely.

But it seems that the problem is even more severe than that. In our blog “Moore’s Law has stopped at 28nm” we examined the expected increase of SoC cost due to the poor scaling of embedded SRAM (eSRAM). We should note that the ARM chart, like many others, is about the cost per transistor associated with dimensional scaling: escalating lithography cost drives up wafer cost, which neutralizes the 2X transistor density increase.
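The arithmetic behind that neutralization is straightforward; here is a toy example with assumed, purely illustrative wafer cost and density figures, not actual foundry pricing:

```python
# If a node shrink doubles transistors per wafer but the wafer itself costs
# roughly twice as much, cost per transistor stays flat.
# All numbers below are illustrative assumptions.

old_wafer_cost = 5_000.0      # dollars per wafer (assumed)
old_transistors = 1.0e12      # transistors per wafer (assumed)

new_wafer_cost = 2.0 * old_wafer_cost    # lithography-driven cost escalation
new_transistors = 2.0 * old_transistors  # 2X density from dimensional scaling

old_cpt = old_wafer_cost / old_transistors
new_cpt = new_wafer_cost / new_transistors

print(f"old node: {old_cpt:.2e} $/transistor, new node: {new_cpt:.2e} $/transistor")
# Both come out equal: the density gain buys no cost reduction.
```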

Yet eSRAM scales far less than 2X and, accordingly, for most SoCs scaling would be even more costly. This issue was confirmed again by the recent VLSI 2014 paper “10-nm Platform Technology Featuring FinFET on Bulk and SOI” by Samsung, IBM, STMicroelectronics, GLOBALFOUNDRIES and UMC. They reported a 10nm bitcell size of 0.053 µm², which is only 25 percent smaller than the 0.07 µm² reported for the 14nm bitcell. One should expect an additional area penalty for effective use in large memory blocks, as reported even for 14nm, which could bring the effective SRAM scaling to only about 15 percent, a long way from the 50 percent required to neutralize the escalating wafer costs.
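Working through the quoted bitcell figures shows how far eSRAM falls short; the array-level overhead below is an assumed value chosen only to illustrate how the effective shrink can drop to roughly 15 percent:

```python
# Reported bitcell areas from the VLSI 2014 paper quoted above.
bitcell_14nm = 0.070   # um^2, 14nm bitcell
bitcell_10nm = 0.053   # um^2, 10nm bitcell

raw_shrink = 1.0 - bitcell_10nm / bitcell_14nm
print(f"raw bitcell shrink: {raw_shrink:.1%}")  # ~24%, i.e. "only 25 percent smaller"

# Assumed extra overhead when the bitcell is used in large memory blocks
# (illustrative only, chosen to show how the effective figure erodes).
array_penalty = 0.12
effective_shrink = 1.0 - (bitcell_10nm * (1.0 + array_penalty)) / bitcell_14nm
print(f"effective SRAM shrink: {effective_shrink:.1%}")  # roughly 15%, far from 50%
```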

However, cost is not the only issue that forced Qualcomm to consider monolithic 3D. Quoting Arabi:

“Interconnect RC is inching up as we go to deeper technology. That is a major problem because designs are becoming interconnect-dominated. Something has to be done about interconnect. What needs to be done is monolithic three-dimensional ICs. Through-silicon vias and micro bumps are useful where you need I/Os … But they are not really solving the interconnect issue I’m talking about … So we are looking at true monolithic 3D. You have normal vias between different stacks. Then interconnect lengths will be smaller than with 2D. If we can connect between layers the delay becomes smaller.”

The interconnect issue was also addressed at IEDM 2013 by Geoffrey Yeap, Qualcomm VP of Technology, in his invited talk:

“As performance mismatch between transistors and interconnects continue to increase, designs have become interconnect-limited. Monolithic 3D (M3D) is an emerging integration technology poised to reduce the gap significantly between transistors and interconnect delays to extend the semiconductor roadmap way beyond the 2D scaling trajectory predicted by Moore’s Law.”

Yeap provided a chart showing the growing gap between transistor delay and interconnect delay.

Arabi’s DAC 2014 keynote was also reported on Cadence’s website, which provides our final Arabi quote for this blog: Qualcomm is looking at “monolithic” 3D-ICs that use normal vias between stacked dies. This can provide a one-process-node advantage along with a 30 percent power savings, 40 percent performance gain, and 5-10 percent cost savings.

Clearly, monolithic 3D integration has a very important role in the future of the semiconductor industry. It is therefore fitting that the traditional IEEE conference on SOI has extended its scope and now calls itself S3S: SOI technology, 3D Integration, and Subthreshold Microelectronics. The 2014 S3S conference is scheduled for October 6-9, 2014 at the Westin San Francisco Airport. This would be a great opportunity to learn more about monolithic 3D technology, with five invited presentations covering topics from design tools to monolithic 3D NAND and other 3D memories. CEA Leti will present their work on CMOS monolithic 3DIC, and researchers from MIT and Stanford will present manufacturing monolithic 3D devices with materials other than silicon.

Blog review April 22, 2014

Tuesday, April 22nd, 2014

Pete Singer blogs that it’s difficult to make interconnects much smaller without introducing significant increases in resistivity. At the upcoming IITC/AMC joint conference in May, many papers focus on new materials that could lead to reduced resistivity and enable further interconnect scaling. Most notably, graphene and CNTs provide an interesting alternative to copper.

Phil Garrou continues his analysis of the IMAPS Device Packaging Conference with a look at the presentations made by Flip Chip International and SUSS (the use of lasers in the manufacturing of WLP); GLOBALFOUNDRIES, Amkor and Open Silicon (a 2.5D ARM dual core product demonstrator which consists of 2 ARM die on a high density silicon interposer); Corning (results of multiple glass interposer programs) and Namics (underfill products for FC BGA and FC CSP).