YouTube Stories get AR-based filters

Highlights:

Google has a new feature for YouTube Stories.

It brings AR features, such as face filters, to videos.

For example, you could wear virtual glasses in your selfie video.

Google has of late been focusing deeply on its augmented reality (AR) products, such as the new features coming to Google Maps and Playground, an experimental mode in the Pixel camera app. Google is now taking its AR work further by introducing the Augmented Faces API for YouTube Stories. What this means is that you should now be able to add fun filters like 3D hats and masks while creating Stories on YouTube, similar to Snapchat.

Google has used machine learning (ML) to make the new feature available in YouTube Stories. “One of the key challenges in making these AR features possible is proper anchoring of the virtual content to the real world; a process that requires a unique set of perceptive technologies able to track the highly dynamic surface geometry across every smile, frown or smirk,” the company writes in its AI blog post.
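The anchoring Google describes boils down to tracking facial landmarks and attaching virtual content to them frame by frame. As a minimal sketch of the idea (the function, its two-eye input format, and the return values are all hypothetical simplifications; real face-mesh models emit hundreds of 3D vertices), an overlay such as virtual glasses could be positioned, scaled, and rotated from just the two eye centers:

```python
import math

def place_glasses(left_eye, right_eye):
    """Given 2D pixel coordinates of the two eye centers, return the
    anchor point, scale, and roll angle for a glasses overlay.
    Hypothetical sketch of landmark-based anchoring, not Google's API."""
    lx, ly = left_eye
    rx, ry = right_eye
    # Anchor the overlay at the midpoint between the eyes.
    anchor = ((lx + rx) / 2, (ly + ry) / 2)
    # Scale the overlay with the interocular distance, so it tracks
    # the face as it moves toward or away from the camera.
    scale = math.hypot(rx - lx, ry - ly)
    # Roll the overlay with the head tilt (angle of the eye line).
    roll = math.degrees(math.atan2(ry - ly, rx - lx))
    return anchor, scale, roll

anchor, scale, roll = place_glasses((120, 200), (220, 200))
```

Recomputing these three quantities on every camera frame is what keeps the effect "stuck" to the face through smiles, frowns, and head turns.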

Inference on the data received through the phone’s camera is done on the device using TensorFlow Lite. Newly developed GPU back-end acceleration boosts this performance whenever it is available. In addition, Google has designed a variety of model architectures with different performance and efficiency characteristics to cover a wide range of consumer hardware. In other words, Google has taken steps to ensure that the feature works across a wide range of phones.

One of the Creator Effects Google illustrates in its blog post involves adding virtual glasses to the face of a YouTube Stories user. The GIF image in Google’s blog post shows a woman walking around in a selfie video. While she is not actually wearing glasses, Google’s new Creator Effects for YouTube Stories adds virtual glasses to her video in real time. YouTube introduced Stories back in late 2017, and the feature has since been a part of YouTube’s UI on mobile.