Hey, very cool, but I'm seeing a mismatch between the real world and virtual objects. When scanning a table with ARKit's debug plane visible, it clearly shows that the two don't line up. I'm not sure exactly what the problem is; it could be a number of things: misaligned virtual camera placement, wrong camera FOV, too much stereo separation between the two virtual cameras, etc.
I know this is difficult to solve, especially across all the different screen sizes, but I still wonder whether you have any tips, or whether a fix is coming in future updates? @Maarten_AryzonTeam

One thing that might cause the issue is the ARGyroCamera script; did you turn that off? Even then there may still be a little misalignment. I also noticed that if you change the ARKit alignment setting from UnityARAlignmentGravityAndHeading to UnityARAlignmentGravity, objects can end up in the wrong position as well.
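For reference, the two changes above could look something like this in a Unity script. This is only a sketch: it assumes the Aryzon `ARGyroCamera` component sits on the same GameObject, uses the Unity ARKit plugin's `UnityEngine.XR.iOS` API, and the constructor arguments shown are assumptions rather than required values.

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // Unity ARKit plugin namespace (assumed)

public class AryzonAlignmentFix : MonoBehaviour
{
    void Start()
    {
        // Disable the Aryzon gyro camera so it does not fight
        // ARKit's own pose tracking (assumes the component is
        // on this GameObject).
        var gyro = GetComponent<ARGyroCamera>();
        if (gyro != null)
        {
            gyro.enabled = false;
        }

        // Keep the session alignment at gravity-only; switching
        // between gravity-and-heading and gravity-only can shift
        // object positions.
        var config = new ARKitWorldTrackingSessionConfiguration(
            UnityARAlignment.UnityARAlignmentGravity,
            UnityARPlaneDetection.Horizontal,
            true,   // getPointCloudData (assumed setting)
            true);  // enableLightEstimation (assumed setting)
        UnityARSessionNativeInterface
            .GetARSessionNativeInterface()
            .RunWithConfig(config);
    }
}
```

If you normally configure the session through the UnityARCameraManager inspector instead, just make sure the alignment dropdown there matches what the Aryzon prefab expects.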

The default settings are a bit off, and we are working on a fix. That should not be too hard, since most iPhones are the same in this respect.

Hello. I have successfully set up the Aryzon SDK in Unity for ARKit development, and I have a question about Aryzon's stereoscopic mode. Whenever I enable Aryzon mode and turn on stereoscopic mode (by turning the phone to landscape), the real-world background does not render and everything is black. I noticed that both cameras in the Aryzon prefab have a black background preset, but changing that did not get me anywhere. How do I enable stereoscopic mode while still seeing the real-world background through the cameras, rather than just a black background?

Glad to hear you got the SDK working. The black background is expected behaviour, since the SDK is made for the Aryzon headset. The headset works by overlaying the phone's display image onto your view of the real world using lenses and mirrors. A black portion of the display emits no light and therefore does not show up in your view, which is exactly what we want for the background: you already see the real world directly. In other words, you would not want to render the camera feed as the background in stereoscopic mode, because inside the headset you would then see both the real world and the filmed world at once.
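This is why the prefab cameras clear to solid black: on a see-through (additive) display, black acts as "transparent". A minimal Unity sketch of that setting, assuming `cam` is one of the prefab's eye cameras:

```csharp
using UnityEngine;

public static class SeeThroughCamera
{
    // Clear to solid black instead of rendering the camera feed.
    // On an additive headset display, black pixels emit no light,
    // so the real world shows through in those areas.
    public static void MakeTransparentBackground(Camera cam)
    {
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.black;
    }
}
```

So rather than fighting the black background, keep it, and let the headset optics supply the real world behind your virtual content.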

If you are using the SDK for something else, note that the camera feed is still a single monoscopic image; for correct depth rendering you would need a stereoscopic camera image as well.