Neural networks have already helped many people beautify their portrait snaps and other images with artistic beauty filters. Now they are taking another leap: letting users add green-screen background effects to videos captured on smartphone cameras.

The latest development comes courtesy of Google, which has been testing the new feature for quite a while. Interestingly, it is being tested in YouTube Stories, which has itself been in beta for some time.

YouTube Stories lets users apply filters powered by neural networks that automatically separate the subject from the background in a video. The background can then be replaced with one of the available backdrops or effects, making the video look as if it had been shot against a green screen.
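The compositing step behind such a filter can be sketched in a few lines. This is a minimal illustration, not Google's implementation: it assumes the neural network has already produced a per-pixel foreground mask (values in [0, 1]), and simply alpha-blends the frame over a replacement background with NumPy.

```python
import numpy as np

def replace_background(frame, mask, background):
    """Composite the frame's foreground over a new background.

    frame:      (H, W, 3) uint8 video frame
    mask:       (H, W) float in [0, 1]; 1.0 = foreground (the subject)
    background: (H, W, 3) uint8 replacement backdrop

    The mask is assumed to come from a segmentation network;
    producing it is the hard part that the neural net solves.
    """
    alpha = mask[..., np.newaxis]              # broadcast over RGB channels
    composite = alpha * frame + (1.0 - alpha) * background
    return composite.astype(np.uint8)

# Toy example: a 2x2 frame whose left column is "foreground".
frame = np.full((2, 2, 3), 200, dtype=np.uint8)   # subject pixels
background = np.zeros((2, 2, 3), dtype=np.uint8)  # new backdrop (black)
mask = np.array([[1.0, 0.0],
                 [1.0, 0.0]])                     # left column = keep

out = replace_background(frame, mask, background)
```

Because the mask is a soft alpha rather than a hard 0/1 cutout, a real pipeline can feather the edges, which is exactly where the blurring discussed below tends to show up.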

Admittedly, the effect doesn’t deliver results identical to footage shot against a real green screen. Edges sometimes appear blurred with certain filters, much like the edge artifacts in dual-camera portrait shots on modern smartphones. But that should barely matter, given that the technology could make its way even into the cheapest devices. Besides, blurry edges are far less noticeable when the background is replaced with stylized, non-realistic effects or backdrops.

And that’s not all. Since the feature can distinguish the foreground from the background, Google is also trying out additional effects that embellish the foreground alone, such as animation or cartoon-style treatments.
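The same segmentation mask enables foreground-only effects. As a hedged sketch (again assuming a mask from the network, and using simple color quantization as a stand-in for a cartoon effect), the filter is applied to the whole frame and then blended back so that only masked pixels change:

```python
import numpy as np

def stylize_foreground(frame, mask, levels=4):
    """Posterize only the foreground; leave the background untouched.

    frame:  (H, W, 3) uint8 video frame
    mask:   (H, W) float in [0, 1]; 1.0 = foreground
    levels: number of color bands per channel (toy "cartoon" effect)
    """
    step = 256 // levels
    # Quantize each channel to the midpoint of its band.
    cartoon = (frame // step) * step + step // 2
    alpha = mask[..., np.newaxis]              # broadcast over RGB channels
    out = alpha * cartoon + (1.0 - alpha) * frame
    return out.astype(np.uint8)

# Toy example: uniform gray frame, left column marked as foreground.
frame = np.full((2, 2, 3), 200, dtype=np.uint8)
mask = np.array([[1.0, 0.0],
                 [1.0, 0.0]])

styled = stylize_foreground(frame, mask)
```

Any per-pixel effect (tint, blur, animation overlay) could be swapped in for the quantization step; the mask-based blend is what keeps the effect confined to the subject.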

The filters are currently being tested in YouTube Stories, which, when it becomes official, will tout them as a highlight feature. But given the broad applicability of the underlying neural networks, Google is unlikely to restrict the filters to YouTube, and will probably bring them to other apps at a later point.