The iPhone 8's 3D Laser Will Enable Stunning Augmented Reality

Tim Cook's vision is coming to life.

The iPhone 8, Apple’s flagship smartphone expected to launch this fall, has been the subject of heavy rumor for several months. A new report on Wednesday, though, revealed a feature that had slipped under the radar until now: a 3D laser scanner, perfect for enabling the jaw-dropping augmented reality experiences that CEO Tim Cook has described in past interviews.

The report from Fast Company, citing sources, claims that a rear-facing vertical-cavity surface-emitting laser, VCSEL for short, will appear on the iPhone 8. It sounds complex, but it’s simply a laser system that calculates the distance to objects by sending out a beam and measuring how long it takes to bounce back to the phone. It’s relatively cheap, at around $2 per phone, and it could greatly improve augmented reality.
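The time-of-flight arithmetic behind such a laser is straightforward: the beam travels to the object and back at the speed of light, so halving the round-trip path gives the range. Here is a minimal sketch of that calculation; the function and constant names are illustrative, not drawn from any Apple API.

```python
# Minimal time-of-flight ranging sketch: convert a laser beam's
# measured round-trip time into a distance estimate.
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def distance_from_round_trip(seconds: float) -> float:
    """Distance to the object given the beam's round-trip time.

    The beam travels out and back, so the one-way distance is
    half of the total path the light covers.
    """
    return SPEED_OF_LIGHT_M_S * seconds / 2

# A reflection arriving after roughly 6.67 nanoseconds implies
# an object about one metre away.
print(round(distance_from_round_trip(6.67e-9), 3))
```

The tiny timescales involved (nanoseconds per metre) are why this is done in dedicated sensor hardware rather than in software.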

Apple’s forthcoming device is rumored to include a dual-lens camera similar to the one found on the iPhone 7 Plus. This should improve the phone’s depth perception, useful for apps that place virtual objects in real-world camera feeds, like Pokémon Go and Snapchat. Until now, though, the system has largely been employed to create DSLR-like shallow depth of field in pictures, which Apple calls “Portrait Mode.”

But Tim Cook is positive about the future of augmented reality. iOS 11, set to launch later this year, includes a new developer toolset called ARKit, which makes building augmented reality apps simpler. In the future, AR could enable applications like a furniture-ordering system that lets users preview chairs in their home before buying them. A laser would improve these applications dramatically.

“You’re going to see some consumer things that are unbelievably cool,” Cook said about AR in a June interview.

The original iPhone compared to the iPhone 8.

The technology also has big implications for the camera’s autofocus abilities. Since the iPhone 6, Apple’s smartphones have used a phase-detection autofocus system that compares light entering the lens from opposite sides. That information tells the camera whether the lens is in focus or needs to adjust. A laser, by contrast, would bounce off objects in the shot and lock focus on the subject in a matter of milliseconds. The Google Pixel and the OnePlus 2 are two phones already on the market that use laser autofocus.

Unfortunately, it’s possible at this stage that Apple misses the deadline and the feature slips to a 2018 iPhone. The company is reportedly struggling to get some of the iPhone 8’s more exotic features ready for launch, like an OLED screen and a hidden fingerprint scanner, so Cook’s vision may have to wait a little longer.