The core building blocks for contextual awareness and intelligence emerge from understanding the immediate situation of an end user. Knowing a user's physical location, the time of day, calendar entries, and data associated with the movement of the mobile device captures much of the information that predictive algorithms require to make contextually relevant suggestions or to execute actions automatically on the user's behalf. This situational device data, in turn, will be augmented by visual and aural information to develop more nuanced, semantically rich descriptions of a user's current environment and likely intent.
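As an illustration of how such signals might feed a predictive algorithm, the sketch below combines location, time, calendar, and motion data into a single snapshot and applies one simple rule. Everything here (the `ContextSnapshot` structure, field names, and the 30-minute threshold) is hypothetical, not drawn from any particular product.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ContextSnapshot:
    """A hypothetical bundle of the situational signals described above."""
    location: str        # coarse place label, e.g. resolved from GPS
    now: datetime        # current time on the device
    next_event: tuple    # (title, start_time, place) from the calendar
    motion: str          # e.g. "stationary", "walking", "in_vehicle"

def suggest(ctx: ContextSnapshot):
    """Return a contextually relevant suggestion, or None.

    Rule (illustrative): if the next meeting starts soon, is somewhere
    else, and the user is not yet moving, prompt them to leave.
    """
    title, start, place = ctx.next_event
    if place != ctx.location \
            and start - ctx.now < timedelta(minutes=30) \
            and ctx.motion == "stationary":
        return f"Time to leave for '{title}' at {place}"
    return None
```

A real assistant would replace the single hand-written rule with a learned model, but the inputs it consumes are the same situational signals.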

The technology industry is already in the midst of a race to connect mobile devices and their physical environments with sensors, beacons, and other data-gathering and broadcasting technologies. Not just smartphones and tablets but also fixed locations and everyday objects are gaining the ability to communicate wirelessly with each other and with the end user. PwC expects this race to accelerate.

This article focuses mainly on physical context information harvested and packaged by accelerometers, gyroscopes, and other mobile device sensors that together create a picture of the state of the device.
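To make "a picture of the state of the device" concrete, the sketch below classifies a window of raw 3-axis accelerometer samples into a coarse motion state. The thresholds and state labels are illustrative assumptions, not values from the article or from any sensor specification.

```python
import math

# Illustrative constants (assumptions, not from the article): acceleration
# magnitudes are in m/s^2, and a device at rest reads roughly gravity.
GRAVITY = 9.81
STATIONARY_TOLERANCE = 0.5   # mean deviation from gravity treated as "at rest"
WALKING_THRESHOLD = 3.0      # larger deviations suggest vigorous movement

def classify_motion(samples):
    """Label a window of 3-axis accelerometer samples with a coarse state.

    `samples` is a list of (x, y, z) tuples in m/s^2. The average deviation
    of the acceleration magnitude from gravity serves as a crude activity
    score; real systems use richer features and trained classifiers.
    """
    if not samples:
        return "unknown"
    deviations = [abs(math.sqrt(x * x + y * y + z * z) - GRAVITY)
                  for x, y, z in samples]
    score = sum(deviations) / len(deviations)
    if score < STATIONARY_TOLERANCE:
        return "stationary"
    if score < WALKING_THRESHOLD:
        return "walking"
    return "in_vehicle_or_running"

# A device lying flat on a table: gravity on one axis, nothing else.
print(classify_motion([(0.0, 0.0, 9.81)] * 10))  # -> stationary
```

Motion states like these, fused with location and calendar data, are the raw material from which the richer semantic descriptions discussed above are built.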