Introducing the Unity ARKit Remote

When Apple announced at WWDC that the ARKit framework would be part of iOS 11, there was great excitement in the development community, as most people saw an amazing opportunity to create incredible AR experiences that could reach a large audience.

Unity had worked with Apple during the previous months to create the Unity ARKit Plugin, which was released on that first day of WWDC, and allowed any developer to use Unity as a content creation platform for ARKit apps. The demo videos of ARKit apps using that plugin started streaming out over the Internet.

Today, we announce a new feature of our Unity ARKit Plugin: Unity ARKit Remote. This allows developers to prototype their experiences in an agile manner, drastically reducing their production timelines. Previously, when a developer needed to iterate on scripts or edit objects, they would have to build out to an iOS device to test their changes. Unity ARKit Remote lets you run a special app on the iOS device that feeds ARKit data back to the Unity Editor, allowing you to react to that data in real time in the Editor.

18 Comments

This is an amazing development tool!
Unfortunately it doesn’t seem to recognize touch events on the phone. I tried to set up the UnityARHitTestExample to register mouse click events so that I could use it for testing, but I don’t seem to be getting any results back for:
List<ARHitTestResult> hitResults = UnityARSessionNativeInterface.GetARSessionNativeInterface().HitTest(point, resultTypes);
// where point is a UnityEngine.XR.iOS.ARPoint and resultTypes is ARHitTestResultType.ARHitTestResultTypeHorizontalPlane
When I run this using the remote and mouse clicks, hitResults always comes back empty, but if I run it on my phone without the remote using touch, I get a list back. Is there a way to get this to work?

Chris, one of the drawbacks of the ARKit Remote is that it does not support HitTest. But we have included an EditorHitTest script component that you can add to any GameObject to simulate a HitTest against planes on the Editor side. The script can be found in the ARKitRemote folder.
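The general idea of an Editor-side simulation can be sketched as follows. This is a minimal sketch, not the actual EditorHitTest script from the plugin; the class name and the assumption that the generated plane GameObjects carry ordinary Unity colliders are mine:

```csharp
using UnityEngine;

// Hypothetical sketch: simulate an ARKit hit test in the Editor by
// raycasting a mouse click against plane colliders in the scene,
// instead of calling the device-only HitTest API.
public class SimulatedHitTest : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit, 30f))
            {
                // Move this GameObject to the simulated hit point,
                // mimicking what a device-side HitTest result would give you.
                transform.position = hit.point;
                transform.rotation = hit.transform.rotation;
            }
        }
    }
}
```

Attach a component like this to the object you want to place, and Editor mouse clicks take the role of device touches during prototyping.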

Beautiful. Thanks guys. I’m always impressed at how well you guys do at being in _our shoes_. I did a lot of Adobe stuff in the past, and they never had any idea how much pain was involved in using their tools. The resources you throw at making demo apps, and the filmmaking stuff you do internally, really pay off for us guys in the pit, imho: you develop tools and processes that actually fit the work we do. Keep it up!!! Brilliant stuff.

Hi Joe,
It’s possible this isn’t working for you for some reason, but otherwise the broad consensus seems to be that ARKit tracking works reliably and is pretty stable under most conditions. What are your observations that lead you to conclude otherwise?

I have this same issue on Windows: a square Unidentified Software Object (U.S.O. for short) with random square dots inside, about 100px × 100px, that is usually drawn on top of the screen. It happens randomly. It reminds me of a Commodore 64 virus.

That square is the rendering of the checkered Cube GameObject present in the EditorTestScene, and it shows that the Remote is working as expected. This is one of the basic operations of the Unity ARKit Plugin: tracking virtual objects and their movement relative to the camera in the real world.

It looks like the game is running on the device while the Editor only updates state. Why not make it run in the Editor and only receive ARKit sensor data, so that you can actually set breakpoints? This would also let me update code on the fly, without needing to build the whole app (Unity + Xcode).

Actually, the game is running in the Editor: you only need to add one GameObject to your scene so it can connect from the Editor to the device. The Unity ARKit Remote app on the device just provides ARKit data, so it works exactly as you described. See the linked forum post and give it a try. Good luck!
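Under the hood, this kind of Editor-to-device channel can be built on Unity's PlayerConnection messaging. The sketch below shows the general pattern, not the plugin's actual implementation; the message GUID and class name are assumptions:

```csharp
using System;
using UnityEditor.Networking.PlayerConnection;
using UnityEngine;
using UnityEngine.Networking.PlayerConnection;

// Hypothetical sketch of the Editor side of a remote link: register for
// messages from the connected player (the device running the Remote app),
// which sends serialized ARKit frames over the player connection.
public static class RemoteDataListener
{
    // Assumed message id; the real plugin defines its own GUIDs.
    static readonly Guid kFrameMsgId = new Guid("11111111-2222-3333-4444-555555555555");

    public static void StartListening()
    {
        EditorConnection.instance.Initialize();
        EditorConnection.instance.Register(kFrameMsgId, OnFrameReceived);
    }

    static void OnFrameReceived(MessageEventArgs args)
    {
        // args.data holds the serialized frame sent from the device;
        // the Editor-side session would deserialize it and drive the scene.
        Debug.Log("Received ARKit frame: " + args.data.Length + " bytes");
    }
}
```

Because the scene logic runs in the Editor and only the sensor data comes from the device, breakpoints and script changes work in the Editor as usual.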