A dedicated sensor hub should free up the main processor, much as a dedicated graphics chip does. Offloading the processing for multiple sensors will hopefully prove beneficial for users. The downside is battery life, since an extra chip draws extra juice.

Cell phones are obviously the big market for this, but I am also intrigued by the possibilities in IoT and other embedded applications. Are the APIs for this open? Could device developers write their own sensor fusion algorithms? I can see this being used, for example, to fuse compass and optical-flow camera data for motion estimation on flying platforms; a rough sketch of what that might look like follows.
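To make that concrete, here is a minimal sketch of the kind of fusion algorithm a developer might run on a hub like this: a complementary filter that blends a gyro's fast but drifting yaw rate with a compass's slow but absolute heading. The driver calls `read_gyro_z()` and `read_compass_heading()` are hypothetical stand-ins, not part of any published SH-1 API, and the loop rate and blend factor are illustrative guesses.

```c
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979f
#endif

#define DT    0.01f   /* assumed 100 Hz sample period, seconds */
#define ALPHA 0.98f   /* trust the gyro short-term, the compass long-term */

/* Hypothetical driver stubs; real code would talk to the sensor hardware. */
static float read_gyro_z(void)          { return 0.0f; } /* yaw rate, rad/s */
static float read_compass_heading(void) { return 0.0f; } /* heading, rad    */

/* Wrap an angle into (-pi, pi] so blending near the wrap point stays sane. */
static float wrap_pi(float a)
{
    while (a >  (float)M_PI) a -= 2.0f * (float)M_PI;
    while (a <= -(float)M_PI) a += 2.0f * (float)M_PI;
    return a;
}

int main(void)
{
    float heading = read_compass_heading(); /* initialize from the compass */

    for (;;) {
        /* Integrate the gyro for a smooth, low-latency prediction... */
        float predicted = wrap_pi(heading + read_gyro_z() * DT);
        /* ...then nudge it toward the compass to cancel gyro drift. */
        float error = wrap_pi(read_compass_heading() - predicted);
        heading = wrap_pi(predicted + (1.0f - ALPHA) * error);
        printf("heading: %.3f rad\n", heading);
        /* A real hub loop would sleep until the next sample tick. */
    }
    return 0;
}
```

The same pattern extends to optical flow: swap the compass correction for a flow-derived velocity and the filter fuses motion instead of heading.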

The ARM TechCon Expo was the perfect venue to demonstrate the first sensor hub for Android. Apple, Samsung, and Windows 8 developers are all using sensor hubs to begin delivering always-on, context-aware services on mobile devices. And while Android had all the pieces in place, the Hillcrest/Atmel partnership makes it easier for independent Android developers using ARM's latest ultra-low-power Cortex-M0+ processor to jump on the sensor hub bandwagon. ARM also partnered with Hillcrest to make it happen, according to ARM's director of Internet of Things Platforms, Simon Ford, who said in a statement that the SH-1 can "turn the vision of 'always on', context-aware devices into reality."