Saying hello to the future of mobile.

iPhone X is here. Pronounced “iPhone 10”, the latest incarnation of Apple’s iconic smartphone marks the tenth anniversary of the original iPhone’s launch a decade ago.

The changes most noticeable to the user are:

All-screen design: the usable area of the phone has been maximised, and Apple have said goodbye to the home button that has been a consistent feature of every iPhone until now. We had our suspicions when the iPhone 7’s home button turned out not to be an actual button but a clever trick using the Taptic Engine.

Face ID replaces Touch ID now that the home button is gone, which Apple say brings stronger security. A new camera system scans your face for authentication, and it won’t be duped by a photograph or a mask (though watch out for evil twins).
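For developers, Face ID is exposed through the same LocalAuthentication framework that handled Touch ID, so existing biometric code largely carries over. A minimal sketch of gating a feature behind biometrics might look like this (the function name is ours; a real app also needs the `NSFaceIDUsageDescription` key in its Info.plist):

```swift
import LocalAuthentication

// Sketch: ask the system to authenticate the user with Face ID
// (or Touch ID on devices that still have a home button).
func authenticateUser(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Check that biometrics are available and enrolled before prompting.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)
        return
    }

    // The system presents the Face ID / Touch ID UI and calls back with the result.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, _ in
        completion(success)
    }
}
```

The same call site works across devices; the framework decides whether Face ID or Touch ID is presented.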

Animoji lets you animate emojis controlled by your own facial expressions, which means yes, you can be an animated chicken or even an animated poo!

Though behind the scenes is where things get really interesting. Powering these new features are some sophisticated additions to the iPhone’s built-in technology. Apple’s renowned in-house CPU development team continue to impress the experts, producing not only their most recent processor but now also their own GPU. They have also included a built-in neural engine, which takes the machine learning and Core ML features up a ‘notch’. Along with a new camera system, these are some cutting-edge tools:

The A11 Bionic processor has a built-in neural engine. Performing up to 600 billion operations per second, it is designed for specific machine learning (ML) algorithms and also powers the brand-new Face ID and Animoji.

The TrueDepth camera system, with a dot projector, infrared camera and flood illuminator. “Your face is now your password”. With dual cameras individually calibrated, plus new gyroscopes and accelerometers for motion tracking, this isn’t just about slicker photos: it opens up the world of augmented reality (AR) to developers.
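To give a flavour of what on-device machine learning looks like in code, here is a hedged sketch of running an image classifier through Vision and Core ML. `SceneClassifier` is a hypothetical stand-in for any compiled Core ML model bundled with an app, not a real Apple-provided model:

```swift
import CoreML
import Vision

// Hypothetical sketch: classify an image with a bundled Core ML model.
// "SceneClassifier" is a placeholder name for an app's own .mlmodel class.
func classify(_ image: CGImage) throws {
    let model = try VNCoreMLModel(for: SceneClassifier().model)

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Vision returns classification observations sorted by confidence.
        guard let top = (request.results as? [VNClassificationObservation])?.first
        else { return }
        print("\(top.identifier) – confidence \(top.confidence)")
    }

    // Inference runs entirely on-device; on the A11 the dedicated
    // hardware can accelerate this kind of workload.
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```

The key point for developers is that the model never leaves the phone: the classification happens locally rather than in the cloud.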

Following the announcements at Apple’s developer conference WWDC in the summer, we were introduced to ARKit and Core ML, which open up the potential of incorporating augmented reality and machine learning into our apps. We noted the potential of dual cameras with interest at the time, and now Apple are bringing us the hardware in iPhone X to realise the potential of blending the real and virtual worlds even more seamlessly. With the all-screen design, Apple say we can “create games and apps offering fantastically immersive and fluid experiences that go far beyond the screen”.
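Getting an AR experience running with ARKit takes surprisingly little code. The following is a minimal sketch of a view controller that starts a world-tracking session; the class and its wiring are illustrative, but `ARSCNView` and `ARWorldTrackingConfiguration` are the real ARKit/SceneKit types introduced at WWDC:

```swift
import UIKit
import ARKit

// Sketch: a minimal view controller hosting an AR scene.
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Track the device's position and orientation in the real world,
        // and detect horizontal surfaces to anchor virtual content on.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

From here, virtual objects placed in the SceneKit scene stay fixed in the real world as the user moves the phone around them.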

An example of the AR capabilities in play comes from Major League Baseball: a fan can watch the game through their phone and see the action in front of them, augmented by a real-time overlay of player information on the players on the field. We are already excited thinking of the ways we could blend live action and data for sports fans.

Image Credit: Apple, Inc.

We are, of course, also childishly looking forward to expressing ourselves through animated monkeys, and what more could a progressive, cutting-edge agency want than to be able to make a poo smile‽ Perhaps this sophisticated technology could come to control more than just an emoji, opening up a whole host of new possibilities when it comes to app development.