
Solid State Technology



Skin-Like Biocompatible Devices Come to Market – and to FLEX/MSTC

February 1st, 2019

By Maria Vetrano

As director of the Center for Bio-Integrated Electronics at Northwestern University, Professor John A. Rogers explores soft materials for conformal electronics, nanophotonic structures, microfluidic devices and MEMS, all with an emphasis on bio-inspired and bio-integrated technologies. During his keynote at FLEX and MEMS & Sensors Technical Congress 2019, February 18-21 in Monterey, Calif., Rogers will present examples of the diverse, novel classes of biocompatible electronic and microfluidic systems with skin-like physical properties that stem from his work in materials science, mechanical engineering, electrical engineering and advanced manufacturing. SEMI’s Maria Vetrano caught up with Rogers to discuss his research, which has already been commercialized by companies such as L’Oréal Group.

SEMI: What is the concept behind skin-interfaced electronic and microfluidic devices?

ROGERS: Biological systems are mechanically soft, with complex, time-dependent 3D curvilinear shapes. Modern electronic and microfluidic technologies are rigid, with simple, static 2D layouts. We believe that eliminating this profound mismatch in physical properties will create vast opportunities in microsystems technologies (electronics, optoelectronics, microfluidics and microelectromechanical devices) that can intimately integrate with the human body for diagnostic, therapeutic or surgical functions. Skin-like devices that assess blood-glucose levels in real-time or continuously monitor the vital signs of infants in neonatal intensive care are just two examples of non-invasive, wirelessly connected biocompatible devices with the potential to dramatically improve quality of life.

SEMI: What are some examples of commercially available biocompatible/microfluidic wearables that have leveraged your research?

ROGERS: We’ve been fortunate in that we have been able to translate some of our ideas into commercial products for broad deployment in both life-enhancing and potentially life-saving applications. In sports and fitness, our skin-interfaced microfluidic systems form the basis of soft devices that capture, store and perform in-situ chemical analysis of sweat. These devices have been launched as products in two different categories – cosmetics and athletics – with two global brands. As an example of the former, L’Oréal Group just unveiled at CES 2019 My Skin Track pH, a thin, flexible version of this technology, designed to determine skin pH from measurement of sweat pH. Once armed with this information, L’Oréal customers can choose skincare products matched to their personal body chemistry. See the video on this device. Notably, a globally recognized consumer brand will reveal a product for athletics around the time of the 2019 Super Bowl on Sunday, February 3.

A look inside My Skin Track pH, which uses Rogers Research Group technology from the Center for Bio-Integrated Electronics at Northwestern University

Our technologies also have applications in clinical medicine and rehabilitation, including soft, skin-interfaced wireless sensors used to assess patient progress in stroke rehabilitation. In contrast with conventional, wired sensors that tether the patient to external boxes of electronics (a design that makes such devices impractical for in-home use), or conventional wearables that are confined to the wrist, our systems apply to the skin like a BAND-AID, and are described as “imperceptible” by stroke patients who are using them during rehab. These platforms measure speech, swallowing capability, movement of limbs, sleep quality, walking and balancing. Healthcare professionals can use the information collected to continue to monitor patients when they leave medical facilities, to understand how patients function in the real world. See video.

SEMI: What work are you doing beyond flexible devices?

ROGERS: We are pursuing devices that are unique not due to their soft mechanics, but due to their extremely small sizes. A good example is My Skin Track UV, which we recently commercialized with L’Oréal’s La Roche-Posay. This millimeter-scale, wireless, battery-free platform for digital UV dosimetry measures UV exposure dose continuously in real time and provides user access to this information via a smartphone app. My Skin Track UV is now available at all Apple stores across the U.S. and through the Apple website. See video.

L’Oréal’s La Roche-Posay My Skin Track UV

Other biocompatible/microfluidic devices based on our technology provide functionality that can save lives. Hydrocephalus patients suffer from excessive buildup of fluid in the brain; if left untreated, the resulting pressures can prove fatal.

Hydrocephalus is treated with shunts, which drain accumulated fluid away from the intracranial space to a distal part of the body, often the abdomen. Unfortunately, shunts have a nearly 100 percent failure rate over a 10-year period, and testing them typically requires an MRI, CT scan or even surgery. Our technology serves as the basis of a bandage-sized, skin-like sensor that is applied to the surface of the skin on the neck. Within five minutes of placement, the sensor can determine non-invasively whether fluid is flowing through the shunt. The net result uniquely supports the rapid evaluation of shunts from home or other non-medical settings. The devices free patients from the constraints of hospitals, giving them a greater sense of security and independence. See video.

SEMI: What would you like FLEX and MSTC attendees to take away from your presentation?

ROGERS: I would like attendees to know that biocompatible microfluidic and electronic wearables that are flexible and conformal to the human body are no longer risky futuristic technologies that exist only in academic labs: They are emerging right now as key products in commercial markets for flexible hybrid electronics (FHE) and MEMS/sensors. Our group alone is anticipating deployment at the scale of tens to hundreds of millions of units in the markets in which we are seeing traction over the next five years. We believe that the broader area will become a multi-billion-dollar market opportunity in five to 10 years.

John Rogers, Ph.D., will present Soft Electronic and Microfluidic Systems for the Skin at FLEX/MSTC on Tuesday, February 19 at 10:30 am.

Register today to connect with him at the event. To learn more, visit the Rogers Research Group website.


Maria Vetrano is a public relations consultant at SEMI.

Getting to Low Power in IoT/IIoT Devices

November 30th, 2018

By Luca Fontanella and Simone Ferri

Over the last three years, the number of battery-operated electronic-component solutions for Internet of Things (IoT) and Industrial IoT (IIoT) applications has been increasing steadily. This trend will continue for years to come, particularly with the growing popularity of mobile devices of all flavors. Addressing power consumption for battery-powered, always-on IoT/IIoT devices — which rely on dozens of electronic components, including sensors — is critical to their commercial success.

The demand for ultra-low-power sensors has accelerated the race to squeeze every last milliwatt from components. Semiconductor suppliers have managed to cut sensor power drastically, by as much as 50%-60% over previous-generation parts. Leveraging state-of-the-art analog design techniques, we have optimized the capacitive readout of MEMS structures. How effective are these techniques? We estimate that with the right mix of our company’s power-saving technologies, our customers could save 3 MW globally per year[1].

What’s next?

While the semiconductor industry continues to investigate novel technologies, approaches and analog IP for greater energy efficiency, we believe that bigger gains in reducing power consumption will come from thinking at the system level. The sensor node is a good place to start.

A typical IoT node is composed of a set of sensors, a microcontroller, a radio frequency (RF) link, and a power-supply system, often based on Li-Po batteries.

Of these, the microcontroller and RF link consume the most energy. In the RF link, power consumption is a function of the distance between end point and receiver and of the amount of data transmitted; thus, at longer distances, reducing the amount of data transmitted saves power. We can achieve this by including some pre-processing capability on board and by extracting more meaningful information from the raw sensor data.
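As an illustration, the tradeoff can be sketched with a toy airtime-energy model. All numbers here, and the linear distance-to-power relationship, are illustrative assumptions rather than measured figures:

```python
# Toy RF-link energy model: transmit energy grows with payload size
# (airtime) and with distance (via the transmit power needed to close
# the link). All constants are illustrative assumptions.

def tx_energy_mj(payload_bytes: int, distance_m: float,
                 base_mw: float = 10.0, mw_per_m: float = 0.5,
                 bitrate_kbps: float = 250.0) -> float:
    """Energy in millijoules to transmit one payload."""
    tx_power_mw = base_mw + mw_per_m * distance_m        # crude distance model
    airtime_s = (payload_bytes * 8) / (bitrate_kbps * 1e3)
    return tx_power_mw * airtime_s                       # mW * s = mJ

# Sending 100 raw 6-byte samples vs. one 12-byte pre-processed summary
# at 100 m: the summary costs 1/50th the energy of the raw payload,
# since energy scales linearly with bytes at a fixed distance.
raw = tx_energy_mj(payload_bytes=600, distance_m=100)
summary = tx_energy_mj(payload_bytes=12, distance_m=100)
print(f"raw: {raw:.3f} mJ, summary: {summary:.3f} mJ")
```

The exact constants do not matter; the point is that pre-processing on the sensor shrinks the payload, and the payload multiplies whatever per-byte cost the link imposes.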

We address this by moving some computation and data analysis into the sensors themselves, where smart hardware “digital blocks” perform faster and more efficiently than software-based routines running on the microcontroller. The beauty of this solution is that it allows the microcontroller to remain in low-power states, receiving only significant information in batches. (The SensorTile development kit, which integrates an ultra-low-power MCU and a BlueNRG Bluetooth radio with sensors, can speed up prototyping of such ultra-low-power IoT devices.) Examples of these advanced digital blocks are the Advanced Embedded Pedometer, the Finite State Machine and Decision Tree, and the Compressed FIFO in an IMU.

The Advanced Embedded Pedometer is a hard-wired step counter that works independently inside the sensor, without CPU intervention: By comparing sensor outputs to pre-defined, pre-loaded patterns, it autonomously decides whether the user is walking or running and starts or stops counting steps accordingly. The sensor then makes this information available to the microprocessor for further processing or for simple notification to the user.
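The in-sensor logic is hard-wired and proprietary, but a minimal software caricature of threshold-based step detection might look like the following. The threshold, debounce gap, and synthetic trace are invented for illustration:

```python
# Software caricature of a hard-wired step counter: count rising edges
# of accelerometer magnitude through a threshold, with a minimum gap
# between steps so jitter is not double-counted. Real in-sensor logic
# and its thresholds are proprietary; these values are illustrative.

def count_steps(magnitudes, threshold=1.3, min_gap=5):
    """Count threshold crossings (rising edges) at least min_gap samples apart."""
    steps, last_step = 0, -min_gap
    above = False
    for i, m in enumerate(magnitudes):
        if m > threshold and not above and i - last_step >= min_gap:
            steps += 1
            last_step = i
        above = m > threshold
    return steps

# Synthetic trace in units of g: quiet, then three strides
trace = [1.0] * 5 + ([1.6] + [1.0] * 7) * 3
print(count_steps(trace))  # 3
```

Running this kind of comparison in dedicated silicon, rather than in an interrupt handler on the MCU, is what lets the host processor sleep through every sample.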

The Finite State Machine and Decision Tree are new functions dedicated to pattern recognition (machine learning) and decision-making: They can perform complex classifications and state detection, and can send dedicated warnings and signals to the microprocessor. A good real-world example is industrial predictive maintenance, where the sensor can categorize and identify different malfunctioning states in the equipment before waking the microprocessor to react.
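A hypothetical sketch of such an in-sensor state machine follows; the states, thresholds, and vibration figures are invented for illustration, and a real implementation runs in dedicated sensor hardware rather than Python:

```python
# Hypothetical in-sensor FSM for predictive maintenance: classify
# vibration RMS into states and raise a wake signal (returned as True)
# only on entry to the FAULT state, so the host MCU sleeps otherwise.

NORMAL, WARNING, FAULT = "NORMAL", "WARNING", "FAULT"

class VibrationFSM:
    def __init__(self, warn=0.5, fault=1.0):
        self.state, self.warn, self.fault = NORMAL, warn, fault

    def step(self, rms: float) -> bool:
        """Advance one sample; return True if the MCU should be woken."""
        prev = self.state
        if rms >= self.fault:
            self.state = FAULT
        elif rms >= self.warn:
            self.state = WARNING
        else:
            self.state = NORMAL
        return self.state == FAULT and prev != FAULT  # wake only on entry

fsm = VibrationFSM()
wakes = [fsm.step(r) for r in [0.1, 0.6, 0.7, 1.2, 1.3, 0.2]]
print(wakes)  # [False, False, False, True, False, False]
```

Note that the MCU is woken exactly once, when the fault condition first appears, not on every sample above the threshold.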

Our products save, on average, about 1 mA (1e-3 A) compared with competitive devices or with our previous-generation parts. At a typical 2 V supply across 1.5e9 units per year[1], that works out to 2.0 x 1e-3 x 1.5e9 = 3 MW.

Integrating programmable sensors and decision trees as well as finite state machines in the sensor allows the sensor to do more of the work while the MCU sleeps. Source: STMicroelectronics

Another example is the compressed FIFO (first-in, first-out) buffer, which stores sensor data inside the sensor not in raw format but compressed with efficient algorithms. In addition to saving memory (and therefore silicon area) inside the sensor chip, this saves power by reducing the number of bytes transferred to the processor and by shortening the communication data flow, which reduces processor-active time.
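One simple scheme of this kind is delta encoding: store the first sample in full, then only the successive differences, which for slowly varying signals fit in far fewer bits. The vendor's actual in-sensor algorithm is not specified here; this sketch is purely illustrative:

```python
# Delta encoding, a simple lossless compression suited to slowly
# varying sensor data: first value verbatim, then only differences.
# Illustrative of the idea, not the vendor's actual FIFO algorithm.

def delta_encode(samples):
    """First value verbatim, then successive differences."""
    return samples[:1] + [b - a for a, b in zip(samples, samples[1:])]

def delta_decode(encoded):
    out = encoded[:1]
    for d in encoded[1:]:
        out.append(out[-1] + d)
    return out

raw = [1000, 1002, 1001, 1005, 1004, 1006]   # slowly varying sensor counts
enc = delta_encode(raw)
assert delta_decode(enc) == raw               # lossless round-trip
print(enc)  # [1000, 2, -1, 4, -1, 2] -- deltas fit in far fewer bits
```

Because the deltas are small, they can be packed into narrow bit fields, which is where the memory and transfer savings come from.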

These examples – the Advanced Embedded Pedometer, the Finite State Machine and Decision Tree, and the compressed FIFO buffer – show that we can develop low-power IoT/IIoT devices through intelligent management of sensors, microcontrollers and other components in any given system. The starting point is an IoT/IIoT node that lets you selectively allocate power-hungry tasks, such as computation and data analysis, to the sensors instead of the microcontroller. Leveraging digital blocks that reside in the sensors alleviates the microcontroller’s typical power drain, allowing it to operate with maximum efficiency.

[1] ST sells about 1.5 billion pieces/year (1.5e9), which typically run from a 2V supply.

Luca Fontanella joined STMicroelectronics in 1995 as an analog designer. In 2001 he joined the MEMS team in a marketing role, and today he is marketing manager in the MEMS Sensor Division. Luca has contributed to more than 25 international patents and has presented at multiple conferences. He earned a degree in Electronic Engineering from Padua University.


Simone Ferri joined STMicroelectronics in 1999 as Central R&D engineer, moved to the Audio Division as a digital designer and is now director of the Consumer MEMS Business Unit. He holds a degree in Electronic Engineering and an MBA from the Polytechnic of Milan.

Emerging MEMS and Sensor Technologies to Watch – 2019 and Beyond

November 14th, 2018

By Dr. Alissa M. Fitzgerald, founder and managing member, A.M. Fitzgerald & Associates, LLC

When developing industry forecasts, market analysts gather data from hundreds of companies to provide actionable insights on established technologies and to identify near-term business opportunities. As developers of new MEMS and sensor technologies for a range of commercial applications, we are often asked by clients, “What’s going to be hot?” Gauging the promise of emerging technologies that are five to 10 years from commercialization requires taking a different tack.

History tells us that most of today’s blockbuster MEMS products were born as academic research projects. Years of hard work by entrepreneurs, funded by millions of dollars, have turned proof-of-concept research into new commercial products. To identify up-and-coming technologies, we gather information straight from the source: academic conferences and articles.

Chirp Microsystems is a good proof point of our research methodology: In my 2012 report on emerging technologies, I highlighted research from UC Berkeley and UC Davis on “In-Air Ultrasonic Rangefinding and Angle Estimation Using an Array of AlN Micromachined Transducers.” Soon after publication, the authors incorporated Chirp Microsystems to commercialize their technology for gesture- and fingerprint-recognition applications.

After five years of development work, Chirp’s products are entering the marketplace. In February 2018, the global supplier TDK InvenSense acquired Chirp, underscoring the company’s commercial potential. At October’s SEMI-MSIG MEMS & Sensors Executive Congress in Napa, Calif., Chirp’s CEO, Dr. Michelle Kiang, held attendees rapt as she described her company’s journey from startup to wholly owned subsidiary.

There’s a method

This year, I reviewed over 100 papers from top researchers presenting noteworthy technologies at the Hilton Head Workshop on Solid-State Sensors, Actuators and Microsystems. My criteria for selection were commercial relevance, a solution to a known or anticipated problem, and game-changing potential. The following caught my eye:

  • Event-driven sensors: Cleverly designed silicon MEMS that consume no power while standing by. A triggering mechanical or thermal event closes a contact within the sensor to activate its circuitry and telemetry. These sensors leverage existing fabrication methods, so they could become commercial products within five years for event monitoring and security applications. (UT Dallas, Northeastern University)
  • Thin film piezoelectric resonators: Advances in PZT deposition methods and process integration with CMOS were used to create monolithic acoustic waveguides for RF filtering in 5G applications. This new filter design, using existing scalable processes, is ripe for commercialization. (Purdue University, Texas Instruments)
Figure: 5-bit accelerometer having zero standby power. The device is open circuit until a threshold acceleration closes a mechanical contact. Source: University of Texas at Dallas.
  • Intra-body communications: MEMS ultrasound transceivers, made from aluminum nitride, can send data directly through flesh at Mbit/s data rate. With trends toward networks of multiple implanted or wearable medical devices, this innovation would enable medically safe, secure, intra-body wireless communication. This early-stage work still needs in vivo validation and would likely require 10 or more years for development and regulatory approval. (Northeastern University)
  • Screen- and 3D-printed sensors: One example of the many exciting innovations using screen- and 3D-printing is the potentiometric nitrate soil sensor. Low-cost and biodegradable, these sensors could be spread over huge areas to monitor a farm’s soil quality. Table-top and hobbyist tools are currently used to make screen- and 3D-printed devices, so new manufacturing equipment and infrastructure must be developed before commercial production can occur. (Purdue University)
  • Biodegradable batteries: A paper-based battery that can deliver 0.5 uW of power, ingeniously using bacterial metabolism as the electrolyte. These batteries dissolve in water and could one day be used to power temporary medical implants or biodegradable sensors. This exciting proof-of-concept prototype will require significant process development and new manufacturing infrastructure. (SUNY Binghamton)
Figure: Paper-based battery dissolves in 60 minutes after immersion in water. Source: SUNY Binghamton

To read more about these technologies, please download my presentation from SEMI-MSIG’s MEMS & Sensors TechXpot at SEMICON West 2018.

Alissa M. Fitzgerald, Ph.D., is the founder and managing member of A.M. Fitzgerald & Associates, LLC, a MEMS and sensors development company in Burlingame, CA. She has over 20 years of engineering experience in MEMS design, fabrication and product development and now advises clients on the entire cycle of product development, from business and IP strategy to manufacturing operations. She is a frequent speaker at industry conferences and currently serves as a director of the Transducer Research Foundation, sponsor of the Hilton Head Workshop. She received her bachelor’s and master’s degrees from MIT and her doctorate from Stanford University in Aeronautics and Astronautics.


Cybersecurity and Industry-Government Collaboration Hot Topics at MEMS & Sensors Executive Congress 2018

November 2nd, 2018

By Maria Vetrano

SEMI-MEMS & Sensors Industry Group (MSIG) welcomed a global group of industry executives to its 14th annual MEMS & Sensors Executive Congress (MSEC), October 29-30, 2018 in Napa, Calif. MEMS and sensors represent a robust sector of the electronic industry. Analyst firm Yole Développement expects the global market for MEMS and sensors to double in the next five years, reaching $100B by 2023, spurred by growth of autonomous mobility products such as Internet of Things (IoT) devices, autonomous cars, fitness and healthcare wearables, and agricultural sensors.

“From drones that navigate any terrain in all lighting conditions to robo-taxis that ‘smell’ cigarette smoke and sensors that monitor animal welfare and food safety, MSEC speakers shared inventive use cases representing new opportunities for MEMS and sensors suppliers,” said Carmelo Sansone, director, MEMS & Sensors Industry Group. “Our keynote speakers spurred attendees to collaborate for the greater good. MITRE Corp. cybersecurity expert Cynthia Wright exhorted attendees to proactively address cybersecurity. DARPA Microsystems Technology Office (MTO) program manager Ron Polcawich invited participation in a rapid innovation and production concept that could dramatically speed design cycles for new MEMS. They exemplify the cross-pollination among commercial industry, government and academia that will continue to advance MEMS and sensors.”

Getting serious about cybersecurity

MITRE cybersecurity expert Cynthia Wright opened MSEC 2018 with a keynote on cybersecurity, alerting attendees to a topic that few in the industry have explored in-depth — but to which they need to pay attention.

“Billions of connected mobile devices democratize knowledge and diversity, boost economies, and accelerate innovation by connecting humans to one another and to our environments,” said Wright. “At the same time, they easily create huge networks that carry operationally and personally sensitive data.”

Because MEMS and sensors are deeply embedded into this vast array of connected devices, industry needs to get involved now or risk potentially grave consequences, claimed Wright. “From the destruction of critical infrastructure to cyberattacks on life-critical medical devices such as insulin pumps and heart monitors to intrusions on autonomous vehicle safety systems, MEMS and sensors suppliers have a responsibility to help improve cybersecurity of connected devices,” she added.

Allaying the potential fears of a roomful of suppliers envisioning complete redesigns of their products, Wright said that not every device requires the same level of security, and suppliers can make a difference with even “minor tweaks.”

Wright suggested encryption at the edge and process authentication. She also gave MSEC attendees a list of design precepts:

  • Build it in. Don’t bolt it on — Design your device with security in mind instead of retrofitting it after-the-fact to realize the most elegant design.
  • Beware of shadow IT — You can’t protect what you don’t know about. Consider physical asset security; software/sensor-guided decision-making; personal or operational data collection; and key process control.
  • Realize your points of vulnerability — because MEMS and sensors are susceptible to spoofing.
  • Learn from cyberattacks of the past — even if they have not been tied directly to MEMS/sensors.
  • Understand IoT software — Realizing that IoT software acts on what the hardware tells it, pay attention to altered sensor data that can lead to altered system performance.

When asked about the role of US government regulation on the security of connected devices, Wright acknowledged that Europe has more restrictive cybersecurity guidelines than the US.

“At the same time, it does not make sense to have two different approaches to cybersecurity of devices. US suppliers who implement more security measures can sell to both markets and to other parts of the world.”

If she could leave MSEC attendees with a closing thought, it might be that companies “don’t need to put a firewall on a toaster.”

“Not every chip has to be secure-foundry secure, but it would be nice if even 10% could hit that mark,” added Wright.

Rapid Innovation through Collaboration

IC designers typically enjoy three to four design cycles in a calendar year, leading to swift advancement of electronics over subsequent years.

Designers in the MEMS community, however, generally have access to one design cycle or less per year, and typical time-to-market is four years for a new product. That slow fabrication pace has hindered deployment of innovative MEMS designs — and it’s something that MSEC closing keynote speaker, Ron Polcawich, program manager, DARPA MTO, would like to change.

Polcawich’s vision of government collaboration with industry and academia spawned the investigational Rapid Innovation through Production MEMS (RIPM) Workshop, which Polcawich and his team held in May 2018. During his keynote, Polcawich shared lessons learned from the workshop while inviting MSEC attendees to get involved.

Before RIPM can become a program, Polcawich knows it will require definition. What would a program concept look like? What is the best way to articulate the potential benefits to the MEMS community, and what additional inputs would be needed?

“This is a daunting challenge from a program planning perspective,” said Polcawich. “In developing RIPM, we realized that we needed representatives from the entire MEMS ecosystem – integrated device manufacturers, or IDMs, equipment suppliers, foundries, and materials providers — to literally come to the table to tackle a common goal. Given the potential for the MEMS industry at large to benefit from rapid innovation and production, we hoped that competitors would realize that leveraging established MEMS processes could deliver significant benefits over the historically entrenched approach: one product, one process.”

Polcawich believes that MEMS suppliers might relinquish the one product, one process paradigm if they knew that their IP were secure.

“While technical challenges to realizing RIPM abound, we knew that we could tap the MEMS industry’s vast knowledge base to address them,” he said. “IP protection is an equally complex issue, and one that may bear a range of approaches. One model could ensure that each IDM owns its IP while the foundry owns the process technology, which it licenses to other companies through process development kits. In addition to speeding innovation, this model provides new revenue sources for the industry.”

Polcawich sees RIPM as a win-win for both commercial industry and the DoD. Speeding design-to-deployment of new MEMS devices could open new and larger markets to MEMS suppliers. It could also support greater product-line diversification and new revenue streams for foundries and other ecosystem members. The DoD could tap new MEMS devices for strategically important applications like tactical radios, unmanned aircraft systems such as drones, and image autofocus for cameras. Polcawich encouraged SEMI-MSIG members to get involved by emailing his group:

New Hall of Fame Members

Three new industry leaders joined the SEMI-MSIG Hall of Fame, first established in 2011 as a means of honoring those who have made a substantial contribution to SEMI-MSIG. Selected by members of the Governing Council, 2018 Hall of Fame inductees include:

  • Michelle Bourke, strategic marketing director, Customer Support Business Group, Lam Research
  • Eric Pabo, business development manager, MEMS, EV Group
  • Yoshio Sekiguchi, senior strategic advisor, TDK InvenSense

Technology Showcase Winner

MSEC recognizes the latest advancements in applications enabled by MEMS and sensors — including those demonstrated by entrepreneurs competing in the Technology Showcase. Selected by a committee of industry experts, five finalists did their best to impress attendees with their technical approach and go-to-market strategies.

The 2018 Technology Showcase winner, Alertgy, presented a biosensor-based wristband device that provides non-invasive, real-time blood glucose monitoring for people with type 2 diabetes, which affects more than 20 million Americans and hundreds of millions more worldwide.

MSEC 2019 will take place October 22-24, 2019, at the Coronado Island Marriott Resort & Spa in Coronado, Calif., just minutes from downtown San Diego.

For more information on MSEC 2019 and other SEMI-MSIG events and programs, please follow @MEMSgroup on Twitter, visit MSIG at SEMI and subscribe to SEMI’s weekly newsletter, SEMI Global Update.

Maria Vetrano is a public relations consultant at SEMI.

Autonomous Mobility and the New Age of Sensor Fusion

October 29th, 2018

By Maria Vetrano

Marcellino Gemelli, director of global business development at Bosch Sensortec, will present at the upcoming MEMS & Sensors Executive Congress on October 29-30, 2018 in Napa, Calif. SEMI’s Maria Vetrano caught up with Gemelli to give MSEC attendees a preview of Gemelli’s feature presentation.

Sensor fusion — the integration of different types of sensors through software algorithms to increase overall system performance and/or reduce power consumption — has come a long way since its inception. In those early days, sensor fusion generally involved MEMS inertial sensors only. The advent of new sensor varieties, including environmental sensors, is making new use cases a reality. Gemelli will explore the ways in which the next generation of sensor fusion is improving autonomous mobility devices.

SEMI: Why are environmental sensors important to autonomous mobility devices?

Gemelli: When most of us think of autonomous systems, we think that they are driven by motion sensors and proximity sensors (e.g., radar, Lidar). When vertical location comes into play, however, in applications such as drones or asset tracking, pressure sensors become an integral part of flight control, navigation and positioning in GPS-challenged areas.

While smell is not commonly considered an electronically enabled sense, the ability to “smell” the environment opens new opportunities. Personal cleaning robots and robo-taxis are good examples of products whose user experience could benefit from scent detection.

SEMI: I’ve never thought much about using sensors to detect smell. How would a robo-taxi or a cleaning robot benefit from scent detection?

Gemelli: Fully autonomous cars will inevitably give rise to robo-taxis. In fact, last month Volvo announced its fully electric robo-taxi, and in March 2018 Waymo announced that Jaguar Land Rover’s SUV would join Fiat Chrysler’s Chrysler Pacifica minivans in its planned fleet of robo-taxis, so we may see robo-taxis in the U.S. within the next five years.

With robo-taxis fast-approaching, we need technologies that provide the same level of oversight that a taxi driver once fulfilled. Gas sensors would function like an electronic nose (e-nose) in a robo-taxi to inform the taxi’s owner of prohibited passenger behavior, such as eating, drinking or smoking in the vehicle, which could potentially damage the vehicle’s interior. Camera sensors could record the act as proof of the offense.

Cleaning robots would be more sophisticated than they are today. In addition to leveraging image and range-finding sensors to more accurately map the rooms in your house, they could also detect scents from spilled red wine, pet urine or other foreign materials. When the cleaning robot, such as a vacuum, detects the foreign substance, it would navigate around the substance instead of going through it and spreading it all over the carpet.

In addition to robo-taxis and cleaning robots, I will also discuss asset tracking and drones.

SEMI: What role does sensor fusion play in autonomous mobility devices?

Gemelli: Combining sensor fusion with artificial intelligence (AI) will generate new use cases and therefore new markets for sensor suppliers.

There is another major benefit as well. With so many connected devices in our lives — including those with cameras, location awareness and always-listening capabilities — we are seeing growing concern about user privacy. Sensor fusion and AI can help to alleviate this concern: By supporting more local processing, they allow for greater control of data, safeguarding personal privacy.

SEMI: Who is responsible for the AI part of the sensor-fusion equation?

Gemelli: AI is a new frontier for MEMS and sensors suppliers. It benefits us and our customers to embrace AI algorithms through in-house development and/or partnerships.

SEMI: What would you like MEMS & Sensors Executive Congress attendees to take away from your presentation?

Gemelli: I plan to issue a call to action to increase research in hybrid sensor-fusion software architectures, including AI, as suppliers’ collaboration will benefit the industry at large.

Marcellino Gemelli is based in Palo Alto, Calif., where he is responsible for business development of Bosch Sensortec’s MEMS product portfolio. He received the ‘Laurea’ degree in Electronic Engineering from the University of Pavia, Italy, while in the Italian Army, and an MBA from MIP, the business school of the Milan Polytechnic. He previously held various engineering and product management positions at STMicroelectronics from 1995 to 2011 in the fields of MEMS, electronic design automation and data storage. He was a contract professor for the Microelectronics course at the Milan Polytechnic from 2000 to 2002.

Marcellino Gemelli will present Environmental Sensors Systems Enabling Autonomous Mobility on Tuesday, October 30 at MEMS & Sensors Executive Congress in Napa Valley, Calif.

Register today to learn more about the connection between sensor fusion, AI and next-generation autonomous mobility devices.

Sensors in the New Age of the Car

September 11th, 2018

By Richard Dixon, Senior Principal Analyst, Sensors, IHS Markit

Sensors are inextricably linked to the future requirements of partially and fully autonomous vehicles. From highly granular dead-reckoning subsystems that rely on industrial-strength gyroscopes for superior navigation to more intelligent and personalized cockpits featuring intuitive human machine interfaces (HMIs) and smart seats, new generations of partially and fully autonomous cars will use sensors to enable dramatically better customer experiences.

Dead reckoning, or, where am I, exactly?

Dead reckoning is the process of calculating one’s current position by using a previously determined position, and advancing that position based upon known speeds over a time slice. As a highly useful process, dead reckoning is the basis for inertial navigation systems in aerospace navigation and missile guidance, not to mention your smartphone.
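In its simplest planar form, the update described above just integrates speed and heading over discrete time steps. A minimal sketch (the function and numbers are illustrative, not any production navigation filter):

```python
import math

def dead_reckon(start_xy, legs):
    """Advance a known (x, y) position in metres through a list of
    (speed_m_s, heading_rad, dt_s) legs. Heading 0 points along +x."""
    x, y = start_xy
    for speed, heading, dt in legs:
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return x, y

# Ten 1 s legs due "east" (+x) at 20 m/s: the vehicle ends 200 m east.
pos = dead_reckon((0.0, 0.0), [(20.0, 0.0, 1.0)] * 10)
```

Real systems fuse many such updates with GPS fixes whenever a signal is available, resetting the accumulated error.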

Today’s best-in-class MEMS gyroscopes can hold position error to 30-50 cm (set by yaw-rate drift) over a distance of 200 m, a typical tunnel length where the GPS signal is lost. Semi-autonomous (L3) and autonomous (L4, L5) driving, however, demand locational accuracy well below 10 centimeters, a level usually reserved for high-end industrial or aerospace gyroscopes with a raw bias instability ranging from 1°/h down to 0.01°/h. These heavy-duty gyros command prices from hundreds to thousands of dollars.
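To see why bias instability matters, consider the cross-track error a constant gyro bias accumulates during a GPS outage: the heading error grows linearly with time, so the lateral position error grows roughly quadratically. A back-of-the-envelope sketch (the 60°/h bias and 72 km/h speed are assumed for illustration):

```python
import math

def cross_track_error_m(bias_deg_per_hr, speed_m_s, distance_m):
    """Lateral position error built up by a constant gyro bias while GPS is
    lost: heading error b*t grows linearly, so error is ~ v * b * t^2 / 2."""
    b = math.radians(bias_deg_per_hr) / 3600.0  # deg/h -> rad/s
    t = distance_m / speed_m_s                  # outage duration in seconds
    return speed_m_s * b * t * t / 2.0

# Assumed numbers: a 60 deg/h bias at 72 km/h through a 200 m tunnel
# lands in the same ballpark as the 30-50 cm figure quoted above.
err_m = cross_track_error_m(60.0, 20.0, 200.0)
```

Pushing the bias down toward 1°/h or below shrinks this error by the same factor, which is why sub-10 cm navigation calls for near-aerospace-grade parts.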

Current performance levels of different gyroscopes by application and performance measure in terms of bias drift (IHS Markit).

This poses an interesting potential opportunity for both industrial-performance MEMS-based gyroscope sensor-makers, such as Silicon Sensing Systems, Analog Devices, Murata, Epson Toyocom and TDK InvenSense, and for broader-based sensor component-makers such as Bosch, Panasonic, STMicroelectronics, and TDK (InvenSense and Tronics).

While MEMS can master performance, size and low weight, cost remains the challenge. The fail-operational requirement for autonomous driving will accommodate higher prices, probably in the $100+ range at first, even for the relatively low volumes of self-driving cars anticipated by 2030. Nonetheless, automotive volumes are very attractive compared to industrial applications and offer a lucrative future market for dead-reckoning sensors.

Your cockpit will get smarter

Automakers are banking on the idea that people like to control their own physical environment. Interiors already feature force and pressure sensors that provide more personalized seating experiences and advanced two-stage airbags for improved safety. In some vehicles, automakers are using pairs of MEMS microphones for noise reduction and image or MEMS infrared sensors for detection of driver presence. Eventually, we might see gas sensors that monitor in-cabin CO2 levels, triggering a warning when they detect dangerous levels that could cause drowsiness. These smart sensors would then “tell” the driver to open the window or activate an air-scrubbing system in a more complex solution. While today’s CO2 sensors are still relatively expensive, we may see them designed-in as lower-cost versions come to market.
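The CO2 warning behavior described above amounts to simple threshold logic. A hedged sketch of what such a gas sensor might drive; the ppm thresholds below are illustrative assumptions, not automotive standards:

```python
def cabin_co2_action(ppm):
    """Map an in-cabin CO2 reading to an action. The 1000/2000 ppm
    thresholds are illustrative assumptions, not values from the article."""
    if ppm < 1000:
        return "ok"
    if ppm < 2000:
        return "warn_driver"      # e.g. suggest opening a window
    return "activate_scrubber"    # engage an air-scrubbing system

action = cabin_co2_action(1500)
```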

Future cockpits will need to go beyond such concepts in the lead-up to fully automated driving. Seats could contain sensitive acceleration sensors that measure heart and respiration rates as well as body movement and activity. Other devices could monitor body humidity and temperature.

We need look no further than Murata, a supplier initially targeting hospital beds with a MEMS accelerometer as a replacement for pulse oximeters. That same Murata accelerometer could potentially be placed in a car seat to detect heart rate. It’s not the only way to do this: another approach to heart-rate measurement uses millimeter-wave radiation, a method that can even look through objects such as books and magazines.

Augmenting sensor-based body monitoring, automotive designers will use cameras to fuse information such as gaze direction, rate of blinking and eye closure, head tilt, and seat data with data gathered by sensors to provide valuable information on the driver’s physical condition, awareness and even mood.

Faurecia’s Active Wellness concept, unveiled at the 2016 Paris Motor Show, suggests that this technology might arrive sooner than we think. Active Wellness collects and analyzes biological data and stores the driver’s behavior and preferences. The prototype provides data to predict driver comfort based on physical condition, time of day and traveling conditions, as well as car operating modes: L3, L4 or L5. Other features such as event-triggered massage, seat ventilation and even changes in ambient lighting or the audio environment are possible.

Faurecia’s “cockpit of the future,” announced at CES 2018. (Faurecia).

Meanwhile, there are other commercial expressions of more advanced HMI as well as plenty of prototypes. Visteon’s Horizon cockpit can use voice activation and hand gestures to open and adjust HVAC. Capacitive sensors are already widely used for touch applications, and touchless possibilities range from simple infrared diodes for proximity measurement to sophisticated 3D time-of-flight measurements for gesture control.

Clearly, automotive designers will have a lot more freedom with HMI in the cabin space, providing a level of differentiation that manufacturers think customers will appreciate—and for which they will pay a premium.

Managing Sensor Proliferation

Researchers are investigating how to manage the myriad sensing inputs in high-functionality vehicles: with so many inputs, designers must address wiring complexity and unwanted harness weight. Faurecia, for example, is considering ways to convert wood, aluminum, fabric or plastic into smart surfaces that can be functionalized via touch-sensitive capacitive switches integrated into the surface. These smart surfaces could rein in the explosion of sensing inputs, thereby diminishing wiring complexity. With availability planned from 2020, Faurecia’s solutions are approaching the market.

Beyond functionalized switches, flexible electronics and wireless power sources, and even energy harvesting (to mitigate power sources), could provide some answers. Indeed, recent research has shown that graphene-based Hall-effect devices can be embedded in large-area flexible Kapton films, and eventually integrated into panels. OEMs such as Jaguar Land Rover are interested in such approaches to address the downsides of electronics and sensor proliferation, especially in luxury vehicles. While smart surfaces would represent a big change in sensor packaging and a disruption in current semiconductor processes, they remain a long way from commercial introduction.

By 2030 or thereabouts, fully autonomous cars that detect our mood, vital signs and activity level could well be available. Cabins could signal us to open the window if CO2 levels become dangerous. HVAC systems could increase seat ventilation or turn up the air conditioning (or the heat) based on our body temperature. Feeling too hot or too cold in the cabin could become a thing of the past, at least for the driver, whose comfort level is the most important! We could feasibly feel more comfortable in the car than in our office, our home or at the movies. Perhaps our car will become our office, our entertainment center and our home away from home as we take long road trips with the family, without a single passenger uttering, “Are we there yet?”

Richard Dixon, Ph.D., is a senior principal analyst for MEMS research and author of more than 50 MEMS-related consulting and market research studies. He is a renowned expert on automotive MEMS and magnetic sensors used in safety, powertrain and body applications. Along with supporting the overall activities of the MEMS and sensors group, his responsibilities include the development of databases that forecast the markets for more than 20 types of silicon-based sensors in more than 100 automotive applications. In addition, he has supported organizations with future scenarios for sensors in cars and has supported many custom projects for companies in the automotive supply chain.

In his prior post at Wicht Technologie Consulting (WTC), Dixon was a senior MEMS analyst where he led research on physical sensors and was the co-author of the NEXUS Task Force Report for MEMS and Microsystems 2005-2009. He has also led commercialization and road-mapping activities on European Commission-funded technology projects, including detailed MEMS chip cost analysis studies.

Dixon worked previously as a journalist in the compound semiconductor industry and has five years of experience as a technology transfer professional at RTI International, where he provided business and market intelligence for early-stage technologies.

Dixon graduated from University of Greenwich with a degree in materials science and earned a doctorate from Surrey University in semiconductor characterization. He speaks English and German.

For more information, visit:

When Will Self-Driving Cars Become a Reality?

June 13th, 2018

By Stephen Breit, Senior Director, MEMS Business, Coventor, a Lam Research Company

Self-driving cars have been all the rage in both the trade and popular press in recent years. I prefer the term “autonomous vehicles,” which more broadly captures the possibilities, encompassing not only small passenger vehicles but mass transit and industrial vehicles as well. Depending on who’s talking, we will all be riding in fully autonomous vehicles in five to 25 years.

The five-year estimates come from startups eager to raise venture capital while the 25-year estimates stem from Tier 1 automotive suppliers who tend to be more conservative in outlook. Regardless of the timeframe, a multitude of investors – national governments, venture capitalists and companies – are dedicating significant capital and effort to make autonomous vehicles a reality.

I must admit that I did not fully grasp the enthusiasm for self-driving cars until last year. First, I’ve always enjoyed driving, unless I’m in stop-and-go traffic, so I couldn’t imagine relinquishing the task. Second, I’ve deliberately arranged my life to spend minimal time in my car. However, traffic has become much heavier in my metropolitan area (Boston), and I know that many people in cities around the world face longer commutes and waste more time in gridlock.

What is the solution to this problem that is only getting worse? I had an epiphany while walking through Shinagawa Station in Tokyo, one of the busiest train stations in the world. Dense streams of people crisscrossed the station on their individual paths, managing to avoid collisions without the aid of traffic controls. Evidently, humans have an innate collision-avoidance ability that makes traffic controls for pedestrian crowds unnecessary. If autonomous vehicles could achieve the same excellence in collision avoidance, we could potentially reduce or eliminate traffic controls for vehicular traffic, providing a huge gain in transportation efficiency and relief from gridlock.

Sensors as core building blocks

New and improved sensors, many based on micro-electromechanical systems (MEMS) technology, are key to achieving this vision. While MEMS inertial sensors (such as accelerometers and gyros) are already integral to the core safety systems in conventional vehicles, they are also essential to improved self-navigation in autonomous vehicles.

The challenge for MEMS suppliers is to deliver inertial sensors that meet the requirements of self-navigation systems, which are different from, and more demanding than, those of safety systems.

Pinpointing a vehicle’s position requires “dead reckoning” based on inertial sensor signals as a supplement to GPS input. Undesirable drift in the inertial sensor signals due to mechanical quadrature, temperature sensitivity and noise can quickly add up to a large error in position that may result in a collision. To meet the more rigorous requirements for autonomous vehicles, suppliers must design MEMS inertial sensors that are substantially more precise and resistant to drift. This requires design software that is both extremely accurate and fast, as well as increasingly precise and reliable manufacturing capabilities.
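Beyond constant bias, white rate noise also corrupts the heading: integrating it produces a random walk whose error grows with the square root of time. A small Monte-Carlo sketch of that growth (the noise level and durations are assumed for illustration):

```python
import math
import random

def sim_heading_drift(rate_noise_std_dps, seconds, trials=4000, seed=0):
    """Monte-Carlo estimate of heading-error RMS (degrees) after integrating
    white gyro-rate noise in 1 s steps for the given number of seconds."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        heading = 0.0
        for _ in range(seconds):
            heading += rng.gauss(0.0, rate_noise_std_dps)  # 1 s integration step
        total += heading * heading
    return math.sqrt(total / trials)

# Noise-only drift grows like sqrt(t): a 4x longer outage roughly doubles it.
err_10s = sim_heading_drift(0.05, 10)
err_40s = sim_heading_drift(0.05, 40)
```

This sqrt-of-time growth is why even small improvements in sensor noise and drift translate directly into longer safe dead-reckoning intervals.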

Other MEMS-based devices, such as micromirrors and micro ultrasound transducers (MUTs), are also promising options for implementing vision and range-finding systems in autonomous vehicles. These sensing systems are needed for building electronic versions of the human collision-avoidance abilities that I witnessed in Shinagawa Station – and it is these systems that autonomous vehicles must emulate.

When will self-driving cars become a reality? Aside from the provocative question that got you to read this far, I don’t have a definitive answer. It will undoubtedly occur in phases, ranging from the driver-augmentation systems available in today’s cars to the full autonomy and ubiquity that will allow reduction of traffic controls in 20 years or more. It is clear that the ultimate goals for autonomous vehicles are highly worthwhile, and that achieving those goals will require better-performing and more diverse MEMS sensors.

MEMS-based sensing systems are essential to autonomous vehicles.

Stephen (Steve) Breit, Ph.D. has been responsible for overseeing development and delivery of Coventor’s industry-leading software tools for MEMS design automation since joining Coventor in 2000. Steve holds numerous patents on software systems and methods for MEMS design automation and virtual fabrication. He holds a Ph.D. in Ocean Engineering from MIT and a B.S. in Naval Architecture and Marine Engineering from Webb Institute.
For more information, visit:

MEMS Fabrication: Growth-Enabler or Industry Roadblock?

May 7th, 2018

By D.Sc.(Tech.) Heikki Holmberg, senior development manager technology, Okmetic Oy 

The MEMS industry has huge growth potential. Will MEMS fabrication act as a bottleneck to continued expansion or as a critical conduit to achieving that potential? Slow development cycles, multiple fabrication platforms and the high cost of small R&D volumes are barriers to rapid development of new products. Understanding the special features of MEMS fabrication, with its many ecosystem options, will help your company navigate these challenges successfully as you develop new and unique products more quickly.

The sheer diversity and varying requirements of MEMS devices and the one product, one process approach are the root causes of most MEMS fabrication challenges. While a single approach will not suit all companies, forming an ecosystem that leverages different companies’ expertise is one of the best ways to address these challenges. However, knitting together this ecosystem is difficult because having multiple partners in the mix only works if the entire supply chain follows common basic design rules and a common top-level technology development roadmap. Because establishing these commonalities takes time and effort, many large- and medium-sized companies prefer to own their supply chain, regardless of the costs. In contrast, emerging companies that cannot support heavy capital investments in new equipment will inevitably find foundries that have all the equipment in place — as well as a wide variety of MEMS processes — a more attractive option. As you embark on a MEMS fabrication journey, which options should you consider to stay ahead of the pack?

Finding ecosystem partners

Since there are so many different technology choices that make process integration difficult in MEMS fabrication, technology know-how is the key to developing unique products in time. If you are not able to own your supply chain, you must find ecosystem partners whose expertise both matches and complements your technology (Table 1).

| Supply Chain Option | Option 1 | Option 2 | Option 3 | Option 4 | Option 5 |
| --- | --- | --- | --- | --- | --- |
| Company expertise | Fully owned component supply chain | Outsourced assembly and ASIC design only | Outsourced MEMS fabrication by pure-play MEMS foundry | Outsourced MEMS fabrication and MEMS design by MEMS foundry | Purchased ready-made MEMS chips |
| MEMS chip fabrication | 2 | 1 | 3 | 2 | 3 |
| MEMS chip design + testing | 2 | 1 | 1 | 2 | 3 |
| ASIC design | 2 | 3 | 2 | 1 | 2 |
| Assembly | 2 | 3 | 2 | 2 | 1 |
| System-level know-how of the component | 2 | 2 | 2 | 2 | 1 |

Table 1: Ecosystem choices as a function of company expertise. 1 = excellent fit, 2 = good fit, 3 = OK fit.

You must also understand how your company can add value — either directly to the end-product or to the other partners in the ecosystem. Above all, there must be trust among partners. A lack of mutual trust will lead to inadequate information-sharing and cumulative knowledge-gathering — slowing problem-solving and/or causing excessively long development cycles.

Option 1: Own your supply chain

While few companies can support Option 1 — having the whole supply chain in-house (like Bosch or STMicroelectronics) — the benefits are many: all know-how will be contained within the company, IP is easy to protect, and supply-chain management is simpler. However, this model demands a significant investment in tools and, in the long run, substantial effort and money to remain technologically competitive in each part of the supply chain.

Option 2: Outsource the ASIC

You might opt for a fully owned supply chain — outsourcing only ASIC fabrication and possibly ASIC design and assembly. This option requires significant expertise in MEMS technology, freeing you from the limitations of a fabless operation model as you gain more control of MEMS fabrication processes. It also offers more IP protection than you would have with a foundry.

While the disadvantages of Option 2 are similar to the fully owned supply chain model, you can mitigate them by outsourcing part of the MEMS chip fabrication supply chain or outsourcing some development to wafer supply companies that can handle the customer-designed embedded structures inside the wafer and/or multi-stack wafer packages. Outsourcing will shorten the process flow and reduce the amount of capital required for growth. A third outsourcing option is to farm out especially difficult or incompatible steps, delivering multiple benefits such as access to better materials, including specialized polymers – which are typically more expensive than silicon – and precious metals, such as gold or platinum, which can contaminate equipment during thin-film deposition processes.

Source: Okmetic Oy

Options 3-4: Foundry models

While more companies still own the whole supply chain or outsource the ASIC portion of their device than use MEMS foundries, the MEMS foundry solution is still an especially good option for cost-conscious companies. If you cannot afford the substantial investment needed for new tools and/or advanced materials or your product requires rapid scaling-up because of short lifecycles, you should explore foundry solutions. There are two main types: pure-play foundries (Option 3) and a foundry with design services and its own IP (Option 4). Option 3 offers no design services, nor does it provide its own designs. It is an excellent choice for a MEMS design-based company lacking its own MEMS fabrication line. However, if your company lacks in-depth knowledge of MEMS design, Option 4 will give you design support and possibly some IP as well.

Okmetic operators inspect silicon wafers to ensure they meet customer specifications for quality and performance. Source: Okmetic Oy

Option 5: Buy the MEMS

Option 5 is to buy ready-made MEMS chips and use them as the foundation to build your component. This is a solid choice if your company is high in the value chain or has system-level expertise.

In the future, MEMS ecosystem players will win by offering both design-library development and a supporting portfolio of MEMS process design kits — just as the IC industry now does. This winning approach will significantly shorten the MEMS product-design cycle – from idea to process development and finished product – and will ultimately change the rules of the game for the MEMS fabrication industry.

About the author

Based in Vantaa, Finland, D.Sc.(Tech.) Heikki Holmberg develops new business opportunities for Okmetic’s high-performance silicon wafers. He also manages Okmetic’s research portfolio, including European Union- and nationally funded research projects.

For more information, visit:

AI and MEMS Sensors: A Critical Pairing

April 5th, 2018

By Kaustubh Gandhi, Senior Product Manager, Bosch Sensortec

Source: Bosch Sensortec

Artificial intelligence (AI) is making headlines everywhere, offering a range of capabilities, including location and motion awareness — determining whether a user is sitting, walking, running or sleeping. Behind the scenes, AI is capturing volumes of data. Makers of smartphones and fitness and sports trackers, along with application developers, are all clamoring for this data because it helps them analyze real-world user behavior in depth. Manufacturers gain a competitive edge by tapping this intelligence: Using it to improve user engagement, they increase the perceived value of their devices, potentially reducing customer churn.

How can consumer-product manufacturers tap the built-in capabilities of MEMS inertial sensors — which are already ubiquitous in end-user devices — to make the most of AI?

Machine learning

Product manufacturers can easily build an activity classification engine using commonly available smart sensors and open-source software. Activity trackers, for example, use raw data first collected via the MEMS inertial sensors that are already installed in smartphones, wearables and other consumer products.

With the building blocks in place, consumer-product manufacturers can apply machine learning techniques to classify and analyze this data. There are several possible approaches, ranging from logistic regression to deep learning neural networks.

One well-documented method for classifying sequences is the Support Vector Machine (SVM). Physical activities, whether walking or playing sports, consist of specific, repetitive movement sequences that MEMS sensors capture as data. This data can be readily processed into well-structured feature sets that SVMs can classify.

Consumer-product manufacturers have gravitated toward the SVM model since it is easy to use, scale and predict. Using an SVM to set up multiple simultaneous experiments for optimizing classification over diverse, complex real-life datasets is far simpler than with other approaches. An SVM also introduces a wide range of size and performance optimization opportunities for the underlying classifier.
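As a toy illustration of the approach, the snippet below trains a minimal linear SVM by stochastic sub-gradient descent on the hinge loss and uses it to separate two synthetic activity clusters. In practice a library implementation would be used; the feature values here are invented, not real sensor data:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=300, lr=0.05, seed=0):
    """Minimal linear SVM: stochastic sub-gradient descent on the hinge loss.
    Labels must be +1/-1. A toy stand-in for a library SVM implementation."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        order = list(range(len(X)))
        rng.shuffle(order)
        for i in order:
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            if margin < 1.0:  # hinge active: pull the boundary toward this sample
                w = [wj + lr * (y[i] * xj - lam * wj) for wj, xj in zip(w, X[i])]
                b += lr * y[i]
            else:             # hinge inactive: only the regularizer acts
                w = [wj * (1.0 - lr * lam) for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0.0 else -1

# Invented features per sensing window: (mean |accel|, accel variance).
walking = [(1.0 + 0.1 * i, 0.2 + 0.05 * i) for i in range(5)]   # label -1
running = [(2.5 + 0.1 * i, 1.0 + 0.05 * i) for i in range(5)]   # label +1
w, b = train_linear_svm(walking + running, [-1] * 5 + [1] * 5)
```

The same structure scales to many activity classes by training one such classifier per class (one-vs-rest), which is part of why SVMs are easy to scale and experiment with.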

Cost impacts of processing, storage and transmission

In practice, recognizing user activity hinges on accurate live classification of AI data. Therefore, the key to optimizing product cost is to balance transmission, storage and processing costs without compromising classification accuracy.

This is not as simple as it sounds. Storing and processing AI data in the cloud would leave users with a substantial data bill. A WiFi, Bluetooth or 4G module would drive up device costs and require uninterrupted internet access, which is not always possible.

Relegating all AI processing to the main processor would consume significant CPU resources, reducing available processing power. Likewise, storing all AI data on the device would push up storage costs.

Resolving the issues

To resolve these technology conflicts, we need to do four things to marry the capabilities of AI with MEMS sensors.

First, decouple feature processing from the classification engine, offloading the engine’s execution to a more powerful external processor. This minimizes the size of the feature processor while eliminating the need for continuous live data transmission.

Next, reduce storage and processing demands by deploying only the features required for accurate activity recognition. In one example using a dataset from the UC Irvine Machine Learning Repository (UCI), an AI model trained on a dataset of activities with 561 features identified user activity with an accuracy of 91.84 percent. Using just the 19 most determinative features, the model still achieved an impressive 85.38 percent. Notably, pre-processing alone could not identify these determinative features; only sensor fusion enabled the data reliability required for accurate classification.
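The idea of keeping only the most determinative features can be sketched with a simple univariate ranking. The data below is synthetic (two informative features deliberately buried among six noise features), not the UCI dataset:

```python
import random

def fisher_score(X, y, j):
    """Univariate separation score for feature j: squared class-mean gap
    over the summed within-class variances."""
    a = [x[j] for x, lbl in zip(X, y) if lbl == 0]
    b = [x[j] for x, lbl in zip(X, y) if lbl == 1]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((v - ma) ** 2 for v in a) / len(a)
    vb = sum((v - mb) ** 2 for v in b) / len(b)
    return (ma - mb) ** 2 / (va + vb + 1e-9)

rng = random.Random(1)

def sample(lbl):
    # Only features 0 and 1 carry class information; the other six are noise.
    shift = 1.5 if lbl else 0.0
    return [rng.gauss(shift, 1.0), rng.gauss(shift, 1.0)] + \
           [rng.gauss(0.0, 1.0) for _ in range(6)]

labels = [0] * 100 + [1] * 100
data = [sample(lbl) for lbl in labels]
ranked = sorted(range(8), key=lambda j: fisher_score(data, labels, j), reverse=True)
```

Deploying only the top-ranked features shrinks the on-device model and cuts storage and transmission, at a modest cost in accuracy, mirroring the 561-vs-19-feature tradeoff described above.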

Third, install low-power MEMS sensors that can incorporate data from multiple sensors (sensor fusion) and enable pre-processing for always-on execution. A low-power or application-specific MEMS sensor hub can slash the number of CPU cycles that the classification engine needs. The onboard software can then directly generate fused sensor outputs at various sensor data rates to support efficient feature processing.

Finally, retrain the model with system-supported data that can accurately identify the user’s activities.

Functional process for activity classification (Source: Bosch Sensortec)

Additionally, cutting the data capture rate can reduce computational and transmission resource requirements to a bare minimum. Typically, a 50 Hz sample rate is adequate for everyday human activities, though it may soar to 200 Hz for fast-moving sports. Dynamic data-rate selection and processing of this kind lowers manufacturing costs while making the product lighter and/or more powerful for the consumer.
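A minimal sketch of such a dynamic data-rate policy; the activity-to-rate mapping is assumed for illustration, with only the 50 Hz and 200 Hz figures taken from the text:

```python
def select_sample_rate_hz(activity):
    """Pick the inertial-sensor output data rate for the detected activity.
    The 50/200 Hz figures follow the text; the activity sets are assumptions."""
    fast_sports = {"sprinting", "tennis", "boxing"}
    return 200 if activity in fast_sports else 50

rate = select_sample_rate_hz("walking")
```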

High efficiency in processing AI data is key to fulfilling its potential, driving down costs and delivering the most value to consumers. MEMS sensors, in combination with sensor fusion and software partitioning, are critical to driving this efficiency. Operating at very low power, MEMS sensors simplify application development while accurately analyzing motion sensor data.

Combining AI and MEMS sensors into a symbiotic system promises a new world of undreamt-of opportunities for designers and end users.

Based in Reutlingen, Germany, Kaustubh Gandhi is responsible for the product management of Bosch Sensortec’s software. For more information, visit:

This blog post is based on an original article that first ran in EDN. It appears here with the permission of the publisher.


What’s Next for Smart Speakers? Smarter Microphones

March 16th, 2018

By Matt Crowley, CEO, Vesper Technologies

Smart speakers and voice assistants are already a big part of everyday life for many of us. Improvements in speech-recognition accuracy, driven by advances in natural language processing, machine learning and cloud computing, are fueling the success of voice assistants. We’re asking Siri to play music, Alexa to order kitchen supplies and OK, Google for the weather. Voice assistants dominated CES, the world’s largest consumer electronics tradeshow, this year.

But what’s behind the smart speakers? Even smarter microphones. There are two kinds of tiny microphones in our smart devices (smartphones, smart home products and smart speakers): capacitive and piezoelectric MEMS (microelectromechanical systems) microphones. MEMS microphones offer a high signal-to-noise ratio (SNR), low power consumption and good sensitivity, and they come in very small packages compatible with surface-mount assembly processes, according to EDN Network. Capacitive microphone technology had been the industry standard for some 50 years until recently, when a new player hit the scene: the piezoelectric MEMS microphone.

Piezoelectric MEMS microphones are transforming the capabilities of smart speakers by offering better far-field performance, ruggedness and extreme durability over time. Piezoelectric MEMS mics are natively immune to environmental contaminants such as dust, water, humidity, oil and even beer. In battery-powered smart speakers, they also offer significant power savings compared to capacitive-based “always on, always listening” solutions: the microphone draws virtually no power until it is turned on by a “wake word” such as “Hey Siri.”

Vesper’s piezoelectric MEMS microphones enable smart speakers and other voice-enabled applications

Another crucial advantage of piezoelectric MEMS comes from the inherent linearity of piezoelectric transduction, which can withstand extreme sound pressure levels without saturating the microphone. For smart speakers, this means that audio quality, particularly the bass response, need not be compromised to avoid saturating the microphones in a music barge-in scenario. From a consumer perspective, this translates to higher wake-word detection accuracy without compromising audio quality while playing music at loud volumes. When integrated into microphone arrays, all of these advantages lead to improved speech recognition accuracy and consistent long-term performance, a rare combination we think is best achieved with piezoelectric microphones.
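The saturation argument can be made concrete with a toy computation: hard-clip a sine wave and measure how much of its energy is distorted. The amplitudes and clip limit below are illustrative, not measured microphone figures:

```python
import math

def clipped_energy_fraction(amplitude, limit, n=1000):
    """Fraction of a sine wave's energy lost to hard clipping at +/-limit,
    a crude proxy for microphone saturation at extreme sound pressure."""
    err = sig = 0.0
    for i in range(n):
        s = amplitude * math.sin(2.0 * math.pi * i / n)
        c = max(-limit, min(limit, s))  # hard saturation
        err += (s - c) ** 2
        sig += s * s
    return err / sig

in_range = clipped_energy_fraction(0.5, 1.0)    # linear: no distortion
overdriven = clipped_energy_fraction(2.0, 1.0)  # saturated: heavy distortion
```

The distorted components are exactly what corrupts wake-word detection during loud music playback, which is why headroom before saturation matters.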

Additional sensor types could significantly increase household utilization of smart speaker products, a major challenge that smart speaker developers are trying to solve. Imagine a smart speaker that can interact and move along with you to teach yoga, or an Echo Dot in your bedroom that seamlessly communicates the temperature and/or humidity level to a thermostat without any user interaction. While motion sensors can help create an emotional bond with the user, on-device environmental sensors can offload some of the communication to the cloud or another IoT hub, reducing latency and power consumption. Some of these features are currently limited to high-priced niche products, but one can expect them to proliferate into the mass market in the years to come.


Amazon’s smart speakers – such as Echo Dot – are always-listening devices.

Amazon’s first-mover advantage has given it a large share of the smart speaker segment. Alexa Voice Services’ growing third-party integrations and rapidly evolving ecosystem of connected smart home services indicate a strong foothold for Amazon. Information plays a key role in the race for market share in these connected services, and MEMS/sensors are at the forefront of this information-gathering process. Adoption of a wide variety of sensors, including technologies such as piezoelectric MEMS sensors, can provide significant value and competitive advantage in data science.


Based in Boston, MA, Matt Crowley is CEO of Vesper Technologies. For more information, visit:

This blog post was originally published on the SEMI Blog.

