When using the face data, how can I know the absolute rotation of the head? What I get by using UnityARMatrixOps.GetRotation(anchorData.transform) is the rotation of the head relative to the position of the phone. If I move the phone, it affects the rotation.
My use case is that I have a virtual face, and I want to move the face based on the user's face movements.
I don't know if my question is clear enough... hope it is.

Is it possible to place an object on a plane that faces toward the camera when placed?

Basically, I want to use an AR HitTest to place an object on a detected plane, but instead of having it face the forward direction set at session initialization, have it face toward the camera when the touch for the HitTest is registered on the screen.
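One way to do this (a generic-Unity sketch, not taken from the plugin samples): take the position from the hit test result, then build a rotation that looks from the hit point toward the camera, flattened so the object stays upright on the plane. The class and method names here are illustrative.

```csharp
// Sketch (untested): place an object at a hit-test position, yawed toward the
// camera rather than the session's initial forward direction. Assumes
// Camera.main is the AR-tracked camera.
using UnityEngine;

public class PlaceFacingCamera : MonoBehaviour
{
    public GameObject prefab;

    public void PlaceAt(Vector3 hitPosition)
    {
        // Direction from the hit point to the camera, flattened onto the
        // horizontal plane so the object stays upright on the surface.
        Vector3 toCamera = Camera.main.transform.position - hitPosition;
        toCamera.y = 0f;

        Quaternion facing = toCamera.sqrMagnitude > 0.0001f
            ? Quaternion.LookRotation(toCamera.normalized, Vector3.up)
            : Quaternion.identity;

        Instantiate(prefab, hitPosition, facing);
    }
}
```

For vertical planes you would flatten against the plane's normal instead of `Vector3.up`.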

Unity Technologies

When using the face data, how can I know the absolute rotation of the head? What I get by using UnityARMatrixOps.GetRotation(anchorData.transform) is the rotation of the head relative to the position of the phone. If I move the phone, it affects the rotation.
My use case is that I have a virtual face, and I want to move the face based on the user's face movements.
I don't know if my question is clear enough... hope it is.


That's what FaceAnchor does - see the FaceAnchorScene example. Instead of the AxesPrefab, put your virtual face there and reference it from ARFaceAnchorManager.
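The same pattern can be wired up by hand. This is a sketch of driving your own virtual-face GameObject from the plugin's face anchor events; the event and type names follow the Unity ARKit plugin's `UnityEngine.XR.iOS` namespace, but verify them against the plugin version you have installed.

```csharp
// Sketch: move a virtual face from ARKit face anchor updates, as the
// FaceAnchorScene example does with its anchored prefab.
using UnityEngine;
using UnityEngine.XR.iOS;

public class VirtualFaceDriver : MonoBehaviour
{
    public GameObject virtualFace;   // your face model, assigned in the Inspector

    void Start()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += OnFaceChanged;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += OnFaceChanged;
    }

    void OnFaceChanged(ARFaceAnchor anchorData)
    {
        // The anchor transform is expressed in the session's coordinate space,
        // the same space the AR camera pose lives in.
        virtualFace.transform.position = UnityARMatrixOps.GetPosition(anchorData.transform);
        virtualFace.transform.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent -= OnFaceChanged;
        UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= OnFaceChanged;
    }
}
```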

Does anyone know how to get access to the ARCamera struct (ARCamera.cs in plugins/iOS/UnityARKit/NativeInterface)? It is used in ARFrame.cs, but I can't see either being used in UnityARSessionNativeInterface.cs at all.

I want to use these 3 vectors in ARCamera.cs to calibrate my camera matrix for marker recognition in OpenCV/ArUco detection:
public Vector3 intrinsics_row1;
public Vector3 intrinsics_row2;
public Vector3 intrinsics_row3;
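For reference, OpenCV's `cameraMatrix` expects the standard 3x3 pinhole layout. The sketch below only shows that conventional mapping; whether ARKit's rows are laid out exactly this way (row-major, focal lengths on the diagonal) is an assumption to verify, and per the reply quoted later in the thread the plugin does not currently fill these fields in.

```csharp
// Sketch: pack three intrinsics rows into the row-major 3x3 camera matrix
// OpenCV/ArUco expects. Conventional pinhole layout is assumed:
//   fx  0  cx
//    0 fy  cy
//    0  0   1
// (fx, fy = focal lengths in pixels; cx, cy = principal point.)
using UnityEngine;

public static class IntrinsicsHelper
{
    public static double[] ToCameraMatrix(Vector3 row1, Vector3 row2, Vector3 row3)
    {
        return new double[]
        {
            row1.x, row1.y, row1.z,
            row2.x, row2.y, row2.z,
            row3.x, row3.y, row3.z
        };
    }
}
```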

@jimmya Nothing here or anywhere mentions how to get the actual centre of a plane anchor.

When using the planeAnchor.center vector to place my object, instead of appearing at the centre of the plane it appears at a corner (not even on the plane, just off it). It is the same each time, so the plugin evidently thinks this is the centre.

So how does one use the planeAnchor data to get the actual centre in world space of the plane? It would be nice to be able to loop through tracked planes, and use their actual world space centre to position an object, and its extent to determine the bounds so it can move around on the surface.

I feel like working this out is the last thing stopping me from really using this properly. It would be great to be able to work out actual world-space bounds and positions so I can make things walk around on surfaces.

Any help will be greatly appreciated!

EDIT: Or should I be using the plane anchor GameObject rather than the plane anchor to position the object? If so, are you getting the bounds of that simply by checking a collider, or is there some XR.iOS function other than the matrix ops needed to get the bounds of an AR-anchor-bound GameObject?
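A sketch of one way to resolve this: the anchor's `center` is an offset in the anchor's local space (and the plugin negates z when converting from ARKit's right-handed coordinates), so it has to be transformed by the anchor pose rather than used directly as a world position - which would put the object off at a corner, as described above. Names follow the plugin's `UnityEngine.XR.iOS` namespace; double-check the z-flip against your plugin version.

```csharp
// Sketch: world-space centre and half-extents of an ARKit plane anchor.
using UnityEngine;
using UnityEngine.XR.iOS;

public static class PlaneCentreHelper
{
    public static Vector3 WorldCentre(ARPlaneAnchor anchor)
    {
        Vector3 pos = UnityARMatrixOps.GetPosition(anchor.transform);
        Quaternion rot = UnityARMatrixOps.GetRotation(anchor.transform);

        // Local centre offset, with z flipped as the plugin does internally
        // when converting ARKit's coordinates to Unity's.
        Vector3 localCentre = new Vector3(anchor.center.x, anchor.center.y, -anchor.center.z);
        return pos + rot * localCentre;
    }

    // extent gives the plane's size; half-extents bound movement on the surface.
    public static Vector3 HalfExtents(ARPlaneAnchor anchor)
    {
        return new Vector3(anchor.extent.x * 0.5f, 0f, anchor.extent.z * 0.5f);
    }
}
```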

@Daemonhahn not sure if it's any help, but to move characters around on generated surfaces I've been using Unity's runtime NavMesh components (on GitHub); it works pretty well as long as the area is big enough.


Thanks @MattMurphy. Unfortunately I am fine with moving stuff in Unity and am actually a professional VR/AR developer in my day job; it's just the actual conversion between AR "space" and world space that seems to be going a bit wrong.

Either way, I found out 4 different ways to do this (and not to do it) and posted my findings here for anyone that needs them:

Hi Nedja,
We don't actually send those parameters over from ARKit into our plugin currently. If you really need it, we can try and set it up for you. But this is really a parameter that is used to figure out the camera projection matrix for the virtual camera, to match it up with the real camera on the device.


Are the camera intrinsics still not exposed? ARKit has a way to get them - can it please be added to the plugin?


Hi @jimmya,
Thanks for replying during the weekend.
That error does say I'm missing UnityEngine.XR.iOS.
I'm using Unity 5.6.1p1.
How do I install UnityEngine.XR.iOS?
Can't express enough appreciation for all of your help.

Current Versions:
- Unity 2017.3.1p1
- UnityARKitPlugin 1.0.14
- Any supported devices
- Any OS from iOS 11.0 to 11.2.x so far
* Note that it happened with previous versions of Unity and ARKit plugin as well.

At a very low repro rate, the UnityARCamera gets stuck on ARTrackingStateLimited + ARTrackingStateReasonInitializing.
When that happens, it breaks ARKit for every app on that device. They all become unable to start tracking, and the only way to fix it is to restart the device.

I didn't notice any specific pattern yet; it's not happening after anything unusual like a crash or errors.

Unity Technologies

Current Versions:
- Unity 2017.3.1p1
- UnityARKitPlugin 1.0.14
- Any supported devices
- Any OS from iOS 11.0 to 11.2.x so far
* Note that it happened with previous versions of Unity and ARKit plugin as well.

At a very low repro rate, the UnityARCamera gets stuck on ARTrackingStateLimited + ARTrackingStateReasonInitializing.
When that happens, it breaks ARKit for every app on that device. They all become unable to start tracking, and the only way to fix it is to restart the device.

I didn't notice any specific pattern yet; it's not happening after anything unusual like a crash or errors.

Has anyone else seen this issue?

Thanks.


This can happen due to a variety of reasons, and you can detect it using OnTrackingChangeEvent - if it persists for a while (10 s?), you should reset your session.
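The detect-and-reset approach in the reply could be sketched like this: watch the tracking state, and if it stays limited/initializing for ~10 seconds, re-run the session with the reset options. The event name, `RunWithConfigAndRunOption`, and the config field follow the Unity ARKit plugin but may differ between plugin versions; treat this as a sketch, not the plugin's own code.

```csharp
// Sketch: reset the ARKit session if tracking stays stuck in
// Limited/Initializing for more than ~10 seconds.
using UnityEngine;
using UnityEngine.XR.iOS;

public class TrackingWatchdog : MonoBehaviour
{
    // Assumed: the same session configuration you started the session with.
    public ARKitWorldTrackingSessionConfiguration config;
    float limitedSince = -1f;

    void Start()
    {
        UnityARSessionNativeInterface.ARSessionTrackingChangedEvent += OnTrackingChanged;
    }

    void OnTrackingChanged(UnityARCamera cam)
    {
        bool stuck = cam.trackingState == ARTrackingState.ARTrackingStateLimited &&
                     cam.trackingReason == ARTrackingStateReason.ARTrackingStateReasonInitializing;
        // Remember when we first saw the stuck state; clear it when tracking recovers.
        limitedSince = stuck ? (limitedSince < 0f ? Time.time : limitedSince) : -1f;
    }

    void Update()
    {
        if (limitedSince >= 0f && Time.time - limitedSince > 10f)
        {
            // Re-run the session, resetting tracking and removing existing anchors.
            UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfigAndRunOption(
                config,
                UnityARSessionRunOption.ARSessionRunOptionResetTracking |
                UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors);
            limitedSince = -1f;
        }
    }
}
```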

I would like to get the ARPlaneAnchorGameObject of the plane I am currently looking at, but somehow the identifier of my hitResult doesn't seem to match up with any of the identifiers of the planes I have already detected. Is there an easy way to get the ARPlaneAnchorGameObject from one of the new irregular shaped planes through a hitTest? My current solution is just to find the closest anchor to the hitpoint, but I can think of plenty of situations where this wouldn't yield the result that I want. Thanks in advance for any ideas!

I found a problem with the Unity ARKit SDK recently. It seems to occur in newer iOS versions since iOS 11.2, but I didn't test every iOS version. The test results are as follows.
1. I build an ARKit app using Unity; it runs well.
2. But when I connect my iPhone to a monitor over HDMI and then start the ARKit app, it crashes immediately.
3. When I open the app in advance, or put it in the background, and then connect the phone to the monitor, it doesn't crash.
4. An ARKit app built with the native iOS ARKit doesn't have this problem.
How can I solve this problem?
Any advice will be appreciated. Thank you very much!

I'm seeing some problems with the texture I get from ARKit compared to Unity's default WebCamTexture. ARKit's texture appears a little blurry and less sharp. If you compare the two images below carefully you'll notice the difference. I'm using the `UnityARVideo` script to render the camera texture for ARKit without any changes to the script. So no changes there.

Attached Files:

I am relatively new to this AR stuff, quite familiar with Unity, but have been struggling with something. There are all these awesome tutorials on plane detection, placing objects on planes, etc. I was wondering, in terms of "anchors": is it possible to "create" an anchor at the position where the player touches the screen? For example, the player's phone is looking at something that is neither horizontal nor vertical (if it were, plane detection would be easiest), and when they tap on a real-world object, an anchor is produced at that position and a GameObject is created at that anchor.

If this is possible, how do I go about creating the anchor and placing a GameObject at that anchor? I have found the available resources not descriptive or detailed enough.
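For arbitrary (non-plane) surfaces, one option is to hit-test against feature points at the touch position and spawn the object at the result pose. This follows the pattern of the plugin's UnityARHitTestExample; verify the API names against your plugin version. Note this places a GameObject at the pose rather than creating a persistent ARKit anchor - for the latter, see the plugin's user-anchor support.

```csharp
// Sketch: on tap, hit-test against ARKit feature points and instantiate a
// prefab at the first result's pose.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

public class TapToPlace : MonoBehaviour
{
    public GameObject prefab;

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        // The hit test expects normalized viewport coordinates, not pixels.
        Vector3 vp = Camera.main.ScreenToViewportPoint(Input.GetTouch(0).position);
        ARPoint point = new ARPoint { x = vp.x, y = vp.y };

        List<ARHitTestResult> results = UnityARSessionNativeInterface
            .GetARSessionNativeInterface()
            .HitTest(point, ARHitTestResultType.ARHitTestResultTypeFeaturePoint);

        if (results.Count > 0)
        {
            Instantiate(prefab,
                UnityARMatrixOps.GetPosition(results[0].worldTransform),
                UnityARMatrixOps.GetRotation(results[0].worldTransform));
        }
    }
}
```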

I get this error every frame when using the remote. My iPhone X's camera will still enable in the app, but the editor doesn't seem to recognize it at all (camera pos/rot is not updated and game view doesn't show my phone camera perspective, just a green screen)

It's been persistent for months, stopping me from doing anything through the ARKit Remote app at all and slowing down development a lot. I tried rebuilding the remote scene, making sure my Unity Editor version matched the version I originally built on, tried switching between Xcode 9.2 stable and Xcode 9.3 beta, tried both old and completely new scenes for the remote connection, and made sure my build settings were set to debug/development build. I'm kind of out of ideas at this point.

I'm using Unity 2017.3.0f3, but this problem has persisted from 2017.2 when I first began the project. My iPhone is on the 11.3 beta and is up-to-date. I also made sure I was a trusted developer and that the remote app has camera permissions on my phone. Any thoughts? I've seen others have this issue and tried some fixes (my list of stuff above), but it doesn't seem like they work consistently for everyone. I'm not sure if Unity's remote app is just very unstable or if I'm doing something wrong.

UPDATE: I've tried multiple macs on High Sierra and Sierra but that hasn't made a difference either

Unity Technologies

I'm seeing some problems with the texture I get from ARKit compared to Unity's default WebCamTexture. ARKit's texture appears a little blurry and less sharp. If you compare the two images below carefully you'll notice the difference. I'm using the `UnityARVideo` script to render the camera texture for ARKit without any changes to the script. So no changes there.


This is the default resolution of the image used by ARKit, which might be lower than the camera's resolution. In ARKit 1.5, they have increased the default resolution, and they allow the developer to change to other video formats if available.

I get this error every frame when using the remote. My iPhone X's camera will still enable in the app, but the editor doesn't seem to recognize it at all (camera pos/rot is not updated and game view doesn't show my phone camera perspective, just a green screen)

It's been persistent for months, stopping me from doing anything through the ARKit Remote app at all and slowing down development a lot. I tried rebuilding the remote scene, making sure my Unity Editor version matched the version I originally built on, tried switching between Xcode 9.2 stable and Xcode 9.3 beta, tried both old and completely new scenes for the remote connection, and made sure my build settings were set to debug/development build. I'm kind of out of ideas at this point.

I'm using Unity 2017.3.0f3, but this problem has persisted from 2017.2 when I first began the project. My iPhone is on the 11.3 beta and is up-to-date. I also made sure I was a trusted developer and that the remote app has camera permissions on my phone. Any thoughts? I've seen others have this issue and tried some fixes (my list of stuff above), but it doesn't seem like they work consistently for everyone. I'm not sure if Unity's remote app is just very unstable or if I'm doing something wrong.

UPDATE: I've tried multiple macs on High Sierra and Sierra but that hasn't made a difference either



Unity Technologies

The "SerializationException" means that the format of the data you're serializing on one end is not being recognized on the other end (the ends being the Editor and the remote device) - most likely because you're using a version of the remote app built with a different version of Unity than the one you have for the Editor. Build the remote app to your device using the same version of the Unity Editor you want to iterate with.

I have a question re: Unity/ARKit as someone who is not familiar w/ iOS/Swift dev at all.

I need a bluetooth peripheral to work with my ARKit experience, and I want to build the experience in Unity. The bluetooth device has both a Unity SDK & an iOS SDK, but, ya know, they're different... If I import the bluetooth peripheral's Unity SDK into the Unity project, along with the ARKit plugin, can I expect the peripheral to work as expected once exported to the Xcode project & then built to the device? Part of me feels like this is a fool's hope, but I'm going to be trying in a few days. Any tips would be helpful! Thanks, all.

Has anyone found a way to update a pre-1.5 project to the new SDK? I am having a tough time. I've tried deleting the old plugin and re-arranging files in the plugin folder to match the previous project structure, but I keep getting console errors. I'd hate to have to redo the whole project just to implement 1.5, or to wait for the asset to be updated on the Asset Store, since Apple is pressing us to update before WWDC.

Unity Technologies

Has anyone found a way to update a pre-1.5 project to the new SDK? I am having a tough time. I've tried deleting the old plugin and re-arranging files in the plugin folder to match the previous project structure, but I keep getting console errors. I'd hate to have to redo the whole project just to implement 1.5, or to wait for the asset to be updated on the Asset Store, since Apple is pressing us to update before WWDC.


It should update pretty cleanly - seems like you might be doing something wrong. Are you writing your code within the plugin folders? You should probably not do that.

It should update pretty cleanly - seems like you might be doing something wrong. Are you writing your code within the plugin folders? You should probably not do that.


Hey jimmya, do you mean I should just drop the UnityARKitPlugin folder into the Assets folder? I was dropping the folder within Plugins/iOS/.

If I place the Bitbucket code into the Assets folder (having previously deleted the old version of the UnityARKitPlugin folder), I get the following console errors. Still better than the "asset not found" errors I was getting previously.

This is the default resolution of the image used by ARKit, which might be lower than the camera's resolution. In ARKit 1.5, they have increased the default resolution, and they allow the developer to change to other video formats if available.


Thanks jimmya for your response. I tried ARKit 1.5 with the maximum resolution possible, but it still appears a little blurry compared to the camera app on the phone. It seems like ARKit applies some kind of noise/processing to the camera texture. I know this is not something Unity has control over, but I'd appreciate any information on what processing is being applied, so that I can try adding a post-processing step to remove it, or a way to get the raw camera texture when I click a button. Basically, I need the texture for some image-processing experiments we are working on in our company, but our networks are getting confused by ARKit textures because of the blurriness.

Attached Files:

I have a question re: Unity/ARKit as someone who is not familiar w/ iOS/Swift dev at all.

I need a bluetooth peripheral to work with my ARKit experience, and I want to build the experience in Unity. The bluetooth device has both a Unity SDK & an iOS SDK, but, ya know, they're different... If I import the bluetooth peripheral's Unity SDK into the Unity project, along with the ARKit plugin, can I expect the peripheral to work as expected once exported to the Xcode project & then built to the device? Part of me feels like this is a fool's hope, but I'm going to be trying in a few days. Any tips would be helpful! Thanks, all.


You want to make sure that the Unity SDK for the bluetooth device supports iOS. Beyond that, no clue here, sorry.

I feel very, very stupid, but I can't get any of the demos to work. Where am I going wrong?

I downloaded the plugin via the repository after the version on the Asset Store was giving me some message spam. To give an example: I tried to build the AddRemoveAnchorExample (if there is supposed to be a central first scene, I am unclear on which one it should be). It builds to Xcode fine. I run it via the debugger on an iPhone 6s, and I just see a black screen with what appears to be the word "MAKE" in tiny letters in the upper right(!?). Debug log:


I am using Unity 2017.3, with Vuforia for devices without ARKit and Unity's ARKit plugin otherwise. My code works with Vuforia, but I can't make it work with Unity's plugin (the first image is the desired effect, where I see the assets underground; the second is with ARKit, where I can't see the assets underground).
I have two cameras in my scene, one for the video only and one for the effect. I am trying to render an effect with two planes on top of the video feed. The video camera has a depth of -1 and the main camera 1 (this is the same for Vuforia and Unity's plugin).

Here is how I render the effect, which I call in an Update():

public void RenderVideoClipping(Texture videoTexture)
{
    // Layers definition:
    //  8 - Over
    //  9 - Under
    // 10 - Over Under
    // 11 - Clip
    // 12 - Over Clip
    // 13 - Under Clip
    // 14 - Over Under Clip

    videoMaterial.mainTexture = videoTexture;

    /**************** Render the clip geometries to the clip texture *************/

    /****** Render the clipped video ***********/
    // This is done by rendering the clip geometry transparently but with depth.
    // Then the video plane is rendered at 1000 m in the background.
    // The areas where the transparent clip geometry was rendered will prevent
    // the video from rendering because of the z depth test.

And here is how I get the videoTexture with the ARKit plugin:
I simply modified the UnityARVideo behaviour slightly by adding a CommandBuffer to blit into a RenderTexture, and I pass that render texture to RenderVideoClipping() above.
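The modification described above could look roughly like this: a CommandBuffer that blits the camera image into a RenderTexture each frame, using the YUV-to-RGB material that UnityARVideo sets up. The component, field names, and camera event chosen here are illustrative, not the poster's actual code.

```csharp
// Sketch: capture the ARKit camera feed into a RenderTexture via a
// CommandBuffer, alongside (or instead of) UnityARVideo's own blit.
using UnityEngine;
using UnityEngine.Rendering;

public class ARVideoCapture : MonoBehaviour
{
    public Material videoMaterial;      // the YUV material UnityARVideo uses
    public RenderTexture videoTexture;  // target passed on to RenderVideoClipping()

    CommandBuffer captureBuffer;

    void OnEnable()
    {
        captureBuffer = new CommandBuffer { name = "Capture AR video" };
        // Blit through the video material so the YUV planes are converted to RGB.
        captureBuffer.Blit(null, videoTexture, videoMaterial);
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeForwardOpaque, captureBuffer);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.BeforeForwardOpaque, captureBuffer);
    }
}
```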

Hi, I ran into a "small" problem. It's somewhat fixable by making a lot of iterations until I get it correct, but that's not how it's supposed to work. When I use the ImageAnchor from the 1.5 version, my prefab spawns at a weird angle. Sometimes it spawns right, but that takes a lot of updates. Also, 180 degrees in a Unity project seems to be around 30~45 degrees in real life once the app is built, which is really weird. And although both Unity and ARKit use 1 unit for 1 meter, I'm trying to map a 20x20 m building over its real-life version, and setting the scale to 20 is still too small, even though my real-life tracker is bigger than the physical size given in the build. Does anyone else have experience with this?

