Wave Unity SDK provides integrated plugins for Unity content. By importing the Wave Unity plugin's scripts, Unity content can manipulate the poses of the head and controller, and the plugin's render script can easily turn the main camera into a stereo camera for VR.

We assume that you already have experience developing Android apps with the Unity IDE and knowledge of the C# language.

Camera is the main camera. Its near/far values affect both eyes' near/far values. It lets the Game view display a monocular view from the head's position. In Play Mode, however, its Culling Mask is set to Nothing, which means nothing is rendered to the display through this camera.

WaveVR_Render The main script for the render lifecycle. It creates both eyes and an ear. In Play Mode, it controls both eyes to render and display the binocular vision. The details are provided later. All game objects of a scene should be ready before the render script is initialized, so set its Script Execution Order to -100. You can set the Script Execution Order in Edit > Project Settings > Script Execution Order.
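The same execution order can also be set from an editor script instead of the Project Settings UI. The following is a sketch using Unity's `MonoImporter` editor API; it assumes a script asset named `WaveVR_Render` exists in the project.

```csharp
// Editor-only sketch: set WaveVR_Render's Script Execution Order to -100
// programmatically. Equivalent to the Project Settings UI step above.
using UnityEditor;
using UnityEngine;

public static class WaveVRExecutionOrder
{
    [MenuItem("Tools/Set WaveVR_Render Execution Order")]
    static void SetOrder()
    {
        foreach (MonoScript script in MonoImporter.GetAllRuntimeMonoScripts())
        {
            if (script.name == "WaveVR_Render" &&
                MonoImporter.GetExecutionOrder(script) != -100)
            {
                MonoImporter.SetExecutionOrder(script, -100);
                Debug.Log("WaveVR_Render execution order set to -100");
            }
        }
    }
}
```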

Binocular vision

WaveVR_PoseTracker It receives pose events and updates the game object's transform according to the tracking device. You can choose an index to decide which device is tracked.

In Play Mode, the main camera is expanded at runtime. The following components and game objects are created and added to the head game object.

Expanded cameras

Inspector of expanded head

You can also expand the cameras by clicking the Expand button. After expanding, the created game objects can be modified.

Expand Button

In the hierarchy, the WaveVR game object acts as a body or a playground origin. You can place your head in a scene by moving WaveVR. Do not change the transform of the head, because it will be overwritten with the HMD pose.

This is the component added to the Eye Center after expanding.

Physics_Raycaster (Unity original script)

These are the game objects added as children of the head after
expanding:

Eye Center

Eye Both

Eye Right

Eye Left

Distortion

Ear

Loading

Both eyes, represented by Eye Right and Eye Left, adjust their positions to the left or right based on the IPD (interpupillary distance); therefore, each eye sees from a different place. Eye Both sits at the center position between the two eyes and simply sets different matrices into the shader for each eye when rendering. Every eye camera's position is set at runtime according to the device, so you do not have to modify the eye transforms yourself. In the editor, we give the transforms a preset position.
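The IPD relationship described above can be illustrated as follows. This is a sketch only: WaveVR sets these positions at runtime from the device, and the field names below are hypothetical, not part of the SDK API.

```csharp
// Illustration of how per-eye positions relate to the IPD.
// Do not apply this yourself; WaveVR overwrites eye transforms at runtime.
using UnityEngine;

public class EyeOffsetSketch : MonoBehaviour
{
    public Transform eyeLeft;   // assumed references to the expanded eye objects
    public Transform eyeRight;
    public float ipd = 0.063f;  // a typical interpupillary distance in meters

    void Apply()
    {
        // Each eye is shifted half the IPD away from the eye center.
        eyeLeft.localPosition  = new Vector3(-ipd * 0.5f, 0f, 0f);
        eyeRight.localPosition = new Vector3( ipd * 0.5f, 0f, 0f);
    }
}
```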

If the GameObject of WaveVR_Render has a Camera, it will be copied to Eye Center and then disabled.

Each eye has a camera. Its near and far clip plane values are set according to the values of the Eye Center's camera, and its projection and field of view are controlled by a projection matrix taken from the SDK. The plug-in sets a target texture when rendering, and the default viewport should be the full texture. The other values you can set on the camera are: clear flags, background, culling mask, clipping planes, allow MSAA, and occlusion culling. All these values are copied from the main camera when this camera is created during expanding. After they are first set, WaveVR will not modify them again. Dynamic Resolution is not supported, and only the Forward rendering path is supported.
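The one-time copy of camera settings can be sketched like this. It is an illustration of the behavior described above, not the SDK's actual code.

```csharp
// Sketch of the kind of copy WaveVR performs when expanding: each listed
// setting is taken once from the main (Eye Center) camera.
using UnityEngine;

public static class CameraCopySketch
{
    public static void CopyEyeSettings(Camera source, Camera eye)
    {
        eye.clearFlags          = source.clearFlags;
        eye.backgroundColor     = source.backgroundColor;
        eye.cullingMask         = source.cullingMask;
        eye.nearClipPlane       = source.nearClipPlane;
        eye.farClipPlane        = source.farClipPlane;
        eye.allowMSAA           = source.allowMSAA;
        eye.useOcclusionCulling = source.useOcclusionCulling;
        // Only the Forward rendering path is supported by WaveVR.
        eye.renderingPath       = RenderingPath.Forward;
    }
}
```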

Distortion distorts both rendered eye textures and presents them to the display. This only works in Unity Editor Play Mode for preview; it is disabled when the project is built as an app, and the WaveVR compositor, which only works on the target device, is used instead.

Ear has an audio listener.

Loading is a mask that blocks other cameras' output on the screen before WaveVR's graphics are initialized. It is disabled as soon as WaveVR's graphics are initialized.
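The Loading mask's behavior amounts to the following sketch. `IsGraphicsReady` is a hypothetical placeholder; in WaveVR the check is made internally by the render plug-in.

```csharp
// Hedged sketch: disable the Loading mask object once the VR graphics
// stack reports ready. Not the SDK's actual implementation.
using UnityEngine;

public class LoadingMaskSketch : MonoBehaviour
{
    void Update()
    {
        if (IsGraphicsReady())
            gameObject.SetActive(false);  // stop masking other cameras' output
    }

    bool IsGraphicsReady()
    {
        // Placeholder condition; WaveVR determines this internally.
        return Time.frameCount > 1;
    }
}
```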

A VR device that can track a user's rotation and position is a 6DoF device, while a VR device that can only track a user's rotation is a 3DoF device.

WaveVR supports both 3DoF and 6DoF. However, apps for 6DoF will not have the same design as apps for 3DoF. A dynamic switch may not be easy to do.

If a developer wants to support only 3DoF, there is an option in WaveVR_PoseTracker. Uncheck Track Position in each WaveVR_PoseTracker component and choose TrackingUniverseSeated as the Tracking Space in WaveVR_Render. This ensures that the head or controller position stays fixed where you want it to be.

The PoseTracker

choose a tracking space

If a developer wants to support 6DoF, choose Model_OriginOnHead or Model_OriginOnGround for your application.