Walking the aisles at CES, you are hard-pressed to find a single product that doesn’t contain at least one sensor. The latest iPhones add a barometric sensor to at least a dozen others. By some predictions, a trillion-sensor world is not far off. Yet what benefits, really, will this ubiquity of sensors deliver? We put this question, and others, to the speakers at the Sensors and MEMS Technology conference.

To Karen Lightman, Executive Director of the MEMS Industry Group, the answer lies in pairing sensors with data analytics. She notes that “MEMS and sensors are already fulfilling the promise to make the world a better place, from airbags and active rollover protection in our cars to the smart toaster that ensures my daughter’s morning bagel won’t be burnt. By combining sensors with data analytics, we can increase that intelligence exponentially.”

An example is biometric measurements, which traditionally suffer from undersampling. Your doctor checks your pulse or blood pressure just once in a while, whereas a typical day may see wild fluctuations. David He, Chief Scientist at Quanttus Inc., predicts a convergence between consumer and clinical use of wearable sensors. Noting that cardiovascular disease and other chronic conditions often go undiagnosed, he foresees ICU-quality wearable sensors that measure your vital signs as you undergo daily activities, relying on enormous datasets to detect problematic patterns. “While everyone is looking for the killer app in wearables,” he urges, “we should be looking for the un-killer app.”

Ben Waber, CEO of Sociometric Solutions, puts sensor data to a radically different use. His firm outfits employees of large companies with sensor-equipped badges that track their interactions. “In any industry the interaction between employees is the most important thing that happens at work,” he told CNN. His badges use motion sensors to follow users as they mix with others in the office and to monitor their posture while seated (slouching suggests low energy). A microphone measures tone of voice, rapidity of speech, and whether a person dominates meetings or allows others to speak in turn.

Waber claims employees can use the results to improve performance and job satisfaction. “You can see the top performers and change your behavior accordingly, to be happier and more productive. In a retail store, you might see that you spend 20% of your time talking to customers, but the guy who makes the most commission spends 30%.” He adds, “I can point to thousands of people who say they like their jobs better.”

Steven LeBoeuf, president of Valencell, points to a problem he calls “death by discharge,” meaning the tendency of novel wearables to “land in the sock drawer before insights can be made” because users tire of keeping them charged. His firm promotes a category he calls “hearables”: sensors added to earphones—powered from a standard jack—that measure pulse, breathing, blood pressure, and even blood-oxygen saturation, all from gossamer-thin vessels on the ear called “arterioles.” Yet measurements alone, he cautions, fall short without comparative analytics. “Human subject testing is a different animal altogether…extensive human subject validation is required for accurate biometric sensing.”

Data is moving from physical to mental. Rana el Kaliouby’s company, Affectiva, combines sensor data with analytics to monitor emotional states, detecting stress, loneliness, depression, and productivity. She foresees a sensor-driven “emotion economy” where devices act on our feelings. She told The New Yorker, “We put together a patent application for a system that could dynamically price advertising depending on how people responded to it.”

Indeed, patent filings abound for mood-sensing devices. Anheuser-Busch’s application for an “intelligent beverage container” notes that without it, sports fans at games “wishing to use their beverage containers to express emotion are limited to, for example, raising a bottle to express solidarity with a team.”

Now stonily indifferent to our feelings, our devices may acquire an almost-human sympathy. “I think that, ten years down the line,” predicts Affectiva’s Kaliouby, “we won’t remember what it was like when we couldn’t just frown at our device, and our device would say, ‘Oh, you didn’t like that, did you?’”

Originally posted by Michael E Stanley of Freescale Semiconductor in The Embedded Beat on Mar 12, 2013

In Orientation Representations Part 1 and Part 2, we explore some of the mathematical ways to represent the orientation of an object. Now we’re going to apply that knowledge to build a virtual gyroscope using data from a 3-axis accelerometer and 3-axis magnetometer. Reasons you might want to do this include “cost” and “cost”. Cost #1 is financial. Gyros tend to be more expensive than the other two sensors. Eliminating them from the BOM is attractive for that reason. Cost #2 is power. The power consumed by a typical accel/mag pair is significantly less than that consumed by a MEMS gyro. The downside of a virtual gyro is that it is sensitive to linear acceleration and uncorrected magnetic interference. If either of those is present, you probably still want a physical gyro.

So how do we go from orientation to angular rates? It’s conceptually easy if you step back and consider the problem from a high level. Angular rate can be defined as change in orientation per unit time. We already know lots of ways to model orientation. Figure out how to take the derivative of the orientation and we’re there!

In our prior postings, we’ve discussed a number of ways to represent orientation. For this discussion, we will use the basic rotation matrix. Jack B. Kuipers has a nice derivation of the derivative of direction cosine matrices in his “Quaternions and Rotation Sequences” text – one of my most used textbooks. It makes a good starting point. Paraphrasing his math: a vector v that is fixed in the reference frame has the body-frame representation

vb = RMt vf

Differentiating both sides and substituting vf = RMt-1vb:

dvb/dt = (dRMt/dt) vf = (dRMt/dt) RMt-1vb (Equation 10)

The same derivative can also be written in terms of the angular rate ω:

dvb/dt = dvf/dt + ω X vb (Equation 11)

Those who take the time to check will note that we have inverted the polarity of the ω in Equation 11 from that shown in the prior posting. In that case ω was the angular velocity of the body frame in the fixed reference frame. Here we want it from the opposite perspective (which would match gyro outputs).

And again, dvf/dt = 0, so

dvb/dt = ω X vb (Equation 13)

Equating Equations 10 and 13:

ω X vb = (dRMt/dt) RMt-1vb

ω X = (dRMt/dt) RMt-1

where ω X denotes the skew-symmetric cross-product matrix of ω:

        |  0   -ωz   ωy |
ω X =   |  ωz   0   -ωx |
        | -ωy   ωx   0  |

Going back to the fundamentals in our first calculus course and using a one-sided approximation to the derivative:

dRMt/dt = (1/Δt)(RMt+1 – RMt)

where Δt = the time between orientation samples

ωb X = (1/Δt)(RMt+1 – RMt) RMt-1

Recall that for rotation matrices, the transpose is the same as the inverse:

RMtT = RMt-1

ωb X = (1/Δt)(RMt+1 – RMt) RMtT (Equation 15)

Equation 15 is a truly elegant equation. It shows that you can calculate angular rates based upon knowledge of only the last two orientations. That makes perfect intuitive sense, and I’m ashamed when I think how long it took me to arrive at it the first time.

An alternate form that is even more attractive can be had by carrying out the multiplications on the RHS:

ωb X = (1/Δt)(RMt+1 RMtT – RMt RMtT)

ωb X = (1/Δt)(RMt+1 RMtT – I3×3) (Equation 22)

For the sake of being explicit, let’s expand the terms. A rotation matrix has dimensions 3×3. So both left and right hand sides of Eqn. 22 have dimensions 3×3.

(1/Δt)(RMt+1 RMtT – I3×3) = (1/Δt) W

                          |  0    W1,2  W1,3 |
W = RMt+1 RMtT – I3×3 =   | W2,1   0    W2,3 |
                          | W3,1  W3,2   0   |

The zero-valued diagonal elements in W follow from the small angle approximation: the diagonal terms of RMt+1 RMtT will be close to one, and they are canceled by the subtraction of the identity matrix. Then:

        |  0   -ωz  +ωy |            |  0    W1,2  W1,3 |
ω X =   | +ωz   0   -ωx |  = (1/Δt)  | W2,1   0    W2,3 |
        | -ωy  +ωx   0  |            | W3,1  W3,2   0   |

and we have:

ωx= (1/2Δt) (W3,2 – W2,3)

ωy= (1/2Δt) (W1,3 – W3,1)

ωz= (1/2Δt) (W2,1 – W1,2)

Once we have orientations, we’re in a position to compute corresponding angular rates with

One 3×3 matrix multiply operation

3 scalar subtractions

3 scalar multiplications

at each time point. Sweet!
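The computation above can be sketched in a few lines of NumPy (a minimal sketch; the function name and conventions are mine, not from any app note):

```python
import numpy as np

def virtual_gyro(rm_t, rm_t1, dt):
    """Estimate body angular rate (rad/s) from two successive orientation
    rotation matrices, per the small-angle derivation above.

    rm_t, rm_t1 -- 3x3 rotation matrices at times t and t+1
    dt          -- time between orientation samples (s)
    """
    # W = RM(t+1) RM(t)^T - I  (the one 3x3 matrix multiply)
    w = rm_t1 @ rm_t.T - np.eye(3)
    # Extract rates from the skew-symmetric part of W
    # (3 scalar subtractions, 3 scalar multiplications)
    wx = (w[2, 1] - w[1, 2]) / (2.0 * dt)
    wy = (w[0, 2] - w[2, 0]) / (2.0 * dt)
    wz = (w[1, 0] - w[0, 1]) / (2.0 * dt)
    return np.array([wx, wy, wz])
```

Note the 0-based indexing: W3,2 in the text is w[2, 1] here.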

Some time ago, I ran a Matlab simulation to look at outputs of a gyro versus outputs from a “virtual gyro” based upon accelerometer/magnetometer readings. After adjusting for gyro offset and scale factors, I got pretty good correlation, as can be seen in the figure below.

You will notice that we started with an assumption that we already know how to calculate orientation given accelerometer/magnetometer readings. There are many ways to do this; I can think of three off the top of my head.
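As one illustration (not necessarily one of the methods the author has in mind), a TRIAD-style construction builds an orientation matrix directly from the accelerometer’s gravity vector and the magnetometer’s field vector; all names and sign conventions here are my own:

```python
import numpy as np

def orientation_from_accel_mag(accel, mag):
    """Build an orientation matrix from accelerometer and magnetometer
    readings (TRIAD-style sketch; platform sign conventions vary).

    accel -- accelerometer reading, dominated by gravity when at rest
    mag   -- magnetometer reading (local magnetic field, body frame)
    Returns a 3x3 matrix whose rows are the north, east, and down
    reference axes expressed in body coordinates.
    """
    down = accel / np.linalg.norm(accel)   # assumes the at-rest reading defines "down"
    east = np.cross(down, mag)             # D x M keeps only the horizontal (east) part
    east /= np.linalg.norm(east)
    north = np.cross(east, down)           # E x D = N completes the right-handed triad
    return np.vstack((north, east, down))
```

This construction degrades exactly where the virtual gyro does: under linear acceleration (accel is no longer just gravity) or magnetic interference.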

The expressions shown generally rely on a small angle assumption. That is, the change in orientation from one time step to the next is relatively small. You can encourage this by using a short sampling interval. You should soon see an app note that my colleague Mark Pedley is working on that discards that assumption and deals with large angles directly. I like the form I’ve shown here because it is more intuitive.

Noise in the accelerometer and magnetometer outputs will result in very visible noise in the virtual gyro output. You will want to low pass filter your outputs prior to using them. Mark will be providing an example implementation in his app note.
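A single-pole IIR filter is one simple way to do that low-pass filtering (a sketch only; the app note’s implementation may well differ):

```python
import numpy as np

def lowpass(samples, alpha=0.1):
    """Single-pole IIR low-pass filter: y[n] = alpha*x[n] + (1-alpha)*y[n-1].

    Smaller alpha means heavier smoothing (and more lag).
    """
    out = np.empty(len(samples))
    acc = samples[0]                      # seed with the first sample to avoid start-up droop
    for i, x in enumerate(samples):
        acc = alpha * x + (1.0 - alpha) * acc
        out[i] = acc
    return out
```

Run each virtual-gyro axis through such a filter before using the rates downstream.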

This is one of my favorite fusion problems. There’s a certain beauty in the way that nature provides different perspectives of angular motion. I hope you enjoy it also.

Stop by the booth to register to win MOD Live from Recon Instruments (winner of the 2011 MEMS Technology Showcase at MEMS Executive Congress®) and see the VGo Robotic Telepresence (enabled by MEMS developed by Freescale Semiconductor).

Connecting the Real World with the Digital World: Harnessing the Power of MEMS
LVCC, North Hall, Room N254
January 11, 2012 | 10:30-11:30am

Spread the word to your colleagues and customers to attend this much-anticipated session. In this session, you’ll learn how MEMS is truly driving the adoption of new consumer applications and products.

Join press and members of MIG for a complimentary continental breakfast and networking time. (RSVP required)

MEMS at 2012 CES® Pressroom
If you can’t make it to the show, be sure to check out the latest press releases related to MEMS at 2012 CES® in our online press room. MIG members, if you are attending the show and have a press release, make sure to send it to Kacey Wherley at kwherley@memsindustrygroup.org for inclusion.

MEMS vendors range from large multi-national and multi-product suppliers such as STMicroelectronics, Bosch, Texas Instruments, and Freescale Semiconductor, down to suppliers focused on the MEMS market with relatively small product portfolios such as VTI, InvenSense, and Memstech. Each vendor is vying for a slice of a market that will be worth more than $1.5 billion in 2016 for MEMS sensor and audio devices in smartphones and tablets alone.

Certain segments of the market have emerged with continued strong growth potential, including MEMS inertial sensors and microphones. The smartphone and media tablet markets are the driving forces behind this growth. “The MEMS market is going through a transition period, as many other semiconductor market segments have when approaching maturity,” says Peter Cooney, practice director, semiconductors. “Leading vendors understand that to be successful in consumer electronics markets, you have to have economies of scale and be able to supply a broad range of solutions.”

As markets mature, component integration is the key to success, reducing BOM cost and board space while offering customers ease of design and reduced time to market. To this end, vendors are racing to diversify and increase product portfolios. This is driving M&A activity in the MEMS market. Over the next few years, the number of vendors addressing high volume MEMS markets will shrink as larger suppliers acquire companies to increase product offerings and use their expanding portfolios to further integrate and achieve market dominance.

ABI Research provides in depth analysis and quantitative forecasting of trends in global connectivity and other emerging technologies. From offices in North America, Europe, and Asia, ABI Research’s worldwide team of experts advises thousands of decision makers through 40+ research and advisory services. Est. 1990. For more information, visit www.abiresearch.com or call +1.516.624.2500.