Android Sensor Fusion Tutorial

While working on my master's thesis, I gained some experience with sensors in Android devices and thought I'd share it with other Android developers stumbling across my blog. In my work I was developing a head-tracking component for a prototype system. Since it had to adapt audio output to the orientation of the user's head, it needed to respond quickly and be accurate at the same time.

I used my Samsung Galaxy S2 and decided to use its gyroscope in conjunction with the accelerometer and the magnetic field sensor in order to measure the user's head rotations both quickly and accurately. To achieve this I implemented a complementary filter to get rid of the gyro drift and the signal noise of the accelerometer and magnetometer. The following tutorial describes in detail how it's done.

There are already several tutorials on how to get sensor data from the Android API, so I'll skip the details on Android sensor basics and focus on the sensor fusion algorithm. The Android API Reference is also a very helpful entry point regarding the acquisition of sensor data. This tutorial is based on the Android API version 10 (platform 2.3.3), by the way.

This article is divided into two parts. The first part covers the theoretical background of a complementary filter for sensor signals as described by Shane Colton here. The second part describes the implementation in the Java programming language. Everybody who thinks the theory is boring and wants to start programming right away can skip directly to the second part. The first part is interesting for people who develop on platforms other than Android, iOS for example, and want to get better results out of the sensors of their devices.

Update (March 22, 2012):
I’ve created a small Android project which contains the whole runnable code from this tutorial. You can download it here: SensorFusion1.zip

Update (April 4, 2012):
Added a small bugfix in the example's GUI code.

Update (July 9, 2012):
Added a bugfix regarding angle transitions between 179° <–> -179°. Special thanks to J.W. Alexandar Qiu who pointed it out and published the solution!

Update (September 25, 2012):
Published the code under the MIT-License (license note added in code), which allows you to do with it pretty much everything you want. No need to ask me first 😉

Sensor Fusion via Complementary Filter

Before we start programming, I want to explain briefly how our sensor fusion approach works. The common way to get the attitude of an Android device is to use the SensorManager.getOrientation() method to get the three orientation angles. These angles are based on the accelerometer and magnetometer output. In simple terms, the accelerometer provides the gravity vector (the vector pointing towards the centre of the earth) and the magnetometer works as a compass. The information from both sensors suffices to calculate the device's orientation. However, both sensor outputs are inaccurate, especially the output from the magnetic field sensor, which includes a lot of noise.

The gyroscope in the device is far more accurate and has a very short response time. Its downside is the dreaded gyro drift. The gyro provides the angular rotation speeds for all three axes. To get the actual orientation, those speed values need to be integrated over time. This is done by multiplying the angular speeds with the time interval between the last and the current sensor output. This yields a rotation increment. The sum of all rotation increments yields the absolute orientation of the device. During this process small errors are introduced in each iteration. These small errors add up over time, resulting in a constant slow rotation of the calculated orientation: the gyro drift.
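As a simplified sketch, the integration step could look like the following. Note that this integrates each axis independently (Euler integration); the class and method names here are my own illustration, not part of the Android API:

```java
public class GyroIntegration {
    // Conversion factor: Android sensor timestamps are in nanoseconds.
    private static final float NS2S = 1.0f / 1000000000.0f;

    // Multiplies the angular speeds (rad/s) with the time interval since
    // the last event and adds the resulting rotation increment to the
    // current orientation, axis by axis.
    public static float[] integrate(float[] orientation, float[] angularSpeed,
                                    long lastTimestamp, long timestamp) {
        float dT = (timestamp - lastTimestamp) * NS2S;
        float[] result = orientation.clone();
        for (int i = 0; i < 3; i++) {
            result[i] += angularSpeed[i] * dT; // rotation increment
        }
        return result;
    }
}
```

Summing these increments over all events yields the gyro-based orientation; any small error in angularSpeed is summed up as well, which is exactly where the drift comes from.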

To avoid both gyro drift and noisy orientation, the gyroscope output is applied only for orientation changes in short time intervals, while the magnetometer/accelerometer data is used as support information over long periods of time. This is equivalent to low-pass filtering of the accelerometer and magnetic field sensor signals and high-pass filtering of the gyroscope signals. The overall sensor fusion and filtering looks like this:

So what exactly does high-pass and low-pass filtering of the sensor data mean? The sensors provide their data at (more or less) regular time intervals. Their values can be shown as signals in a graph with time as the x-axis, similar to an audio signal. The low-pass filtering of the noisy accelerometer/magnetometer signal (accMagOrientation in the above figure) yields orientation angles averaged over time within a constant time window.

Later in the implementation, this is accomplished by slowly introducing new values from the accelerometer/magnetometer to the absolute orientation:

// low-pass filtering: every time a new sensor value is available
// it is weighted with a factor and added to the absolute orientation
accMagOrientation = (1 - factor) * accMagOrientation + factor * newAccMagValue;

The high-pass filtering of the integrated gyroscope data is done by replacing the filtered high-frequency component from accMagOrientation with the corresponding gyroscope orientation values:
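Per orientation angle, this replacement boils down to a weighted sum of the two signals. A minimal sketch, assuming a filter coefficient of 0.98 (the value is a typical choice, to be tuned for your device):

```java
public class ComplementaryFilter {
    // Weighting factor: a value close to 1 trusts the gyro for
    // short-term changes (high-pass); the small remainder slowly pulls
    // the result towards accMagOrientation, compensating the drift.
    public static final float FILTER_COEFFICIENT = 0.98f;

    public static float[] fuse(float[] gyroOrientation, float[] accMagOrientation) {
        float[] fused = new float[3];
        for (int i = 0; i < 3; i++) {
            fused[i] = FILTER_COEFFICIENT * gyroOrientation[i]
                     + (1.0f - FILTER_COEFFICIENT) * accMagOrientation[i];
        }
        return fused;
    }
}
```

This is the counterpart to the low-pass snippet above: the gyro term keeps the fast, accurate rotations, while the accMagOrientation term anchors the absolute orientation over time.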

Assuming that the device is turned 90° in one direction and after a short time turned back to its initial position, the intermediate signals in the filtering process would look something like this:

Notice the gyro drift in the integrated gyroscope signal. It results from the small irregularities in the original angular speed. Those little deviations add up during the integration and cause an additional undesirable slow rotation of the gyroscope-based orientation.

181 thoughts on “Android Sensor Fusion Tutorial”

Hello!
I found your tutorial very interesting. I am doing research which requires me to record limb movements. I am planning to do it using an accelerometer and gyroscope. Is there any way I can save the recorded data along with the date and time of day? (I need the time at which every movement takes place.)

Hi Sampreeti,
you can record any data produced by the sensors. You simply have to provide the required memory, and after each measurement you save the sensor data alongside the current timestamp. But keep in mind that the sensor fusion described in this tutorial only returns the orientation of the device, not its linear movement. However, if you had several sensors (or sensor sets) and the distance between them, you could calculate the movement of a specific limb.
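A minimal sketch of such a timestamped log line (the CSV-style format and the helper class are just an illustration, not part of the tutorial code):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class SensorLogLine {
    // Formats one measurement as "date time;x;y;z" so that every
    // sample carries the wall-clock time it was taken at.
    public static String format(long timeMillis, float[] values) {
        SimpleDateFormat fmt =
            new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS", Locale.US);
        StringBuilder sb = new StringBuilder(fmt.format(new Date(timeMillis)));
        for (float v : values) {
            sb.append(';').append(v);
        }
        return sb.toString();
    }
}
```

In an Android SensorEventListener you would call something like this from onSensorChanged() and append the resulting line to a file.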

The practicality of MEMS sensors (such as common accelerometers or gyroscopes in mobile devices) depends heavily on the accuracy of the data you want to record. Maybe you don’t even require any sensor fusion and an accelerometer would suffice. However, if you require high data accuracy like in a motion capturing system, I doubt that MEMS sensors will fit your needs. In such a case I would recommend a computer vision approach.
Good luck with your research,
Paul

The magnetometer is used for azimuth stabilisation. This sensor provides the magnetic heading. However, I'm not sure whether azimuth (or orientation[0]) = 0° equals north. Maybe you will have to do some post-processing.

I found your tutorial very interesting and useful. Is it possible to track linear motion? One more thing I don't understand: is the fused data from the complementary filter the accelerometer data without noise, or the gyroscope data without drift?

The question about linear motion tracking is asked very often here. The answer is that I don't see a reliable way to track linear motion with the sensor data we use in this tutorial. Most people who try to track linear motion start off with the accelerometer, which is not feasible with current state-of-the-art MEMS sensors. Perhaps there will be more accurate sensors one day. But as for now, I don't think it's possible.

The fused data is composed of both: low-pass filtered accelerometer/magnetometer data (not entirely without noise, only the high-frequency parts removed) and high-pass filtered gyro data (removing the drift while the system moves slowly).

I would really appreciate if you could please give some pointers on the doubts that I have.

I have raw sensor data from the accelerometer, gyroscope and magnetometer on Android. I am getting this data through an app, Sensor Fusion. Now I want to process these data offline in Matlab to get accurate orientation. Is it possible to obtain code for the functions that you have used, such as SensorManager.getRotationMatrix? If I am not wrong, these functions come from Android, but is it possible to see what they are doing so I can implement them in Matlab?

Hi Chintan,
the only way to get the code behind Android APIs like SensorManager.getRotationMatrix I can think of is to look into the Android code itself. It's open source: https://source.android.com/source/index.html
You could pull the source from Google's git repository, search for the SensorManager in it and look into the method in question.
Best Regards,
Paul

Chintan Shah, 2015-05-20 2.10 pm

Hello Paul

Thank you for your reply.

I am new to this topic and need some clarification about some terms.

1. So the idea behind fusion is to get a better orientation estimate, is that correct? And for that people use either a Kalman or a complementary filter.
2. I saw the popular video on Youtube by David Sachs and he says that in order to obtain linear acceleration, we need to estimate gravity and subtract it from the raw acceleration. How do I estimate gravity from raw acceleration? I came across the following link; please tell me if it is correct: http://developer.android.com/guide/topics/sensors/sensors_motion.html#sensors-motion-accel
3. And how to estimate heading?

Hi Chintan,
1. correct.
2. yes, the described method looks like a sound way to get the acceleration without gravitational influence.
3. how would you define the term 'heading'? Do you mean the overall orientation of the device or rather the azimuth (i.e. the 'compass' component of the orientation)? In both cases the answer is already at hand (see sensor fusion or determination of orientation).
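Regarding point 2, the gravity estimate from the linked Android documentation can be sketched like this. ALPHA = 0.8 is the example value used in those docs; treat it as a starting point rather than a definitive constant:

```java
public class GravityFilter {
    // Smoothing factor of the low-pass filter (example value from the
    // Android developer docs on motion sensors).
    public static final float ALPHA = 0.8f;

    // Updates the running gravity estimate in place (low-pass filter of
    // the raw acceleration) and returns the linear acceleration, i.e.
    // raw acceleration minus the estimated gravity.
    public static float[] removeGravity(float[] gravity, float[] accel) {
        float[] linear = new float[3];
        for (int i = 0; i < 3; i++) {
            gravity[i] = ALPHA * gravity[i] + (1.0f - ALPHA) * accel[i];
            linear[i] = accel[i] - gravity[i];
        }
        return linear;
    }
}
```

For a device at rest, the gravity estimate converges to the raw accelerometer reading and the linear acceleration goes to zero, which is exactly the behaviour you want.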

Chintan Shah, 2015-05-27 10.17 am

Hello Paul

Thank you for your reply. I just have one last question. I came across TYPE_ROTATION_VECTOR, which outputs orientation in terms of quaternions. Have you looked at this output, and does it behave exactly the same on all devices?

I still think I should implement my own orientation estimation, but if TYPE_ROTATION_VECTOR does the same job then I am not sure it is worth spending time on orientation estimation.

Hi,
at the time I wrote the article, the rotation vector sensor type was not available in Android. I had no chance to evaluate its behaviour and the quality of its output. You should try it out, do some tests and evaluate whether the data quality meets the requirements of your application. It could save you some time.
Best Regards
Paul

Chintan Shah, 2015-05-29 4.20 pm

Hello Paul,

I have implemented your code in Matlab and it works really well. I just have a few questions:

I walked a few steps in a straight line and recorded raw data from the accelerometer, magnetometer and gyro, and then passed them to the complementary filter.

The only problem is that I was expecting the yaw angle to be nearly constant, but it deviates towards the end, and I found that the magnetometer data also deviates towards the end. How can I fix this?

If possible, can I send you the plots by email?

Many thanks.

Best Regards

Chintan

Samuel, 2015-06-05 1.30 am

Hi Paul and Chintan

I am trying to understand how the rotation vector behaves in order to estimate the orientation of the wrist using an Android Wear device. It tells you the angle you have rotated with respect to a fixed frame in the three axes. Have you found something relevant, Chintan? Regards.