A 3D artist could use a full ARKit modeling and painting platform like that to visualize just about anything. Architects could easily model extensions for pre-existing buildings with it, designers could dress virtual models in their latest outfits and so forth.

Another demo comes via developer Osama Abdel-Karim, who built his app using a combination of iOS 11’s ARKit and Vision frameworks to create the finger painting effect.

One of the cool frameworks that Apple introduced in iOS 11 is the Vision framework. It exposes several computer vision techniques through a handy, efficient API. In particular, we are going to use its object tracking feature.

Object tracking works as follows: first, we give Vision an image along with the coordinates of a rectangle, within the image's bounds, around the object we want to track. We then create a tracking request to initialize tracking. Finally, for each subsequent frame in which the object has moved, we feed in the new image together with the observation produced by the previous operation.

Given that, Vision returns the object's new location.
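A minimal sketch of that loop in Swift might look like the following. This is not the code from Abdel-Karim's app; the class name and structure are illustrative, though the Vision calls (`VNDetectedObjectObservation`, `VNTrackObjectRequest`, `VNSequenceRequestHandler`) are the real iOS 11 API.

```swift
import Vision
import CoreVideo

/// Illustrative wrapper around Vision's object tracking loop.
final class ObjectTracker {
    // The sequence handler must be reused across frames so Vision keeps its state.
    private let sequenceHandler = VNSequenceRequestHandler()
    private var lastObservation: VNDetectedObjectObservation

    /// Step 1: seed the tracker with the object's rectangle in the first frame.
    /// `initialBoundingBox` is a normalized CGRect (0...1, origin at bottom-left).
    init(initialBoundingBox: CGRect) {
        lastObservation = VNDetectedObjectObservation(boundingBox: initialBoundingBox)
    }

    /// Steps 2–3: feed in each new frame together with the previous observation;
    /// Vision returns the object's updated location.
    func track(in pixelBuffer: CVPixelBuffer) -> CGRect? {
        let request = VNTrackObjectRequest(detectedObjectObservation: lastObservation)
        request.trackingLevel = .accurate

        try? sequenceHandler.perform([request], on: pixelBuffer)

        guard let newObservation = request.results?.first as? VNDetectedObjectObservation else {
            return nil
        }
        lastObservation = newObservation   // carried forward into the next frame
        return newObservation.boundingBox  // the object's new (normalized) location
    }
}
```

In a camera app you would call `track(in:)` once per frame from the capture output callback, converting the returned normalized rectangle into view coordinates before drawing.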

Apple will most likely feature ARKit-driven apps in a special section on the App Store.

The company has said that ARKit apps will require an iPhone or iPad with an A9 or later processor, meaning owners of the iPhone 6, iPad Air 2 and older devices will be left out in the cold.