Quickstart for Google VR SDK for iOS

This guide shows you how to use the Google VR SDK for iOS to
create your own Virtual Reality (VR) experiences.

You can use Google Cardboard to turn your smartphone into
a VR platform. Your phone can display 3D scenes with stereoscopic rendering, track
and react to head movements, and let users interact with apps by detecting when
they press the viewer button.

The Google VR SDK for iOS contains tools for spatial audio that go far beyond simple
left side/right side audio cues to offer 360 degrees of sound. You can also
control the tonal quality of the sound—for example, you can make a conversation
in a small spaceship sound drastically different than one in a large,
underground cave.
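As an illustrative sketch only, the snippet below shows how the SDK's GVRAudioEngine might be set up to place a sound in 3D space. The selectors are approximated from the GVRAudioEngine header and should be checked against the SDK; the sound file name is a placeholder.

```objc
#import "GVRAudioEngine.h"

// Hypothetical setup; verify selectors against GVRAudioEngine.h.
GVRAudioEngine *audioEngine =
    [[GVRAudioEngine alloc] initWithRenderingMode:kRenderingModeBinauralHighQuality];
[audioEngine preloadSoundFile:@"object_sound.wav"];  // placeholder asset name
[audioEngine start];

// A sound object positioned in 3D space, not just panned left/right.
int soundId = [audioEngine createSoundObject:@"object_sound.wav"];
[audioEngine playSound:soundId loopingEnabled:true];
[audioEngine setSoundObjectPosition:soundId x:0.0f y:0.0f z:-2.0f];

// Enabling a room model shapes the reverb, which is what makes the same
// sound differ between a small spaceship and a large cave.
[audioEngine enableRoom:true];
```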

This tutorial uses "Treasure Hunt", a demo app that showcases the core
features of the Google VR SDK. In the game, users look
around a virtual world to find and collect objects. It shows you how to:

- Implement a UIViewController to host GVRCardboardView
- Define a renderer to implement the GVRCardboardViewDelegate protocol
- Implement prepareDrawFrame
- Implement drawEye

Implement a UIViewController to host GVRCardboardView

The TreasureHunt app implements a UIViewController subclass, the
TreasureHuntViewController class, which holds an instance of the
GVRCardboardView class. An instance of the TreasureHuntRenderer class is
created and set as the GVRCardboardViewDelegate for the GVRCardboardView.
In addition, the app provides a render loop, the TreasureHuntRenderLoop
class, that drives the -render method of the GVRCardboardView.
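Condensed from the sample, the host view controller might look roughly like this; treat the render loop's initializer and lifecycle methods as a sketch of the sample's approach rather than its exact code.

```objc
#import <UIKit/UIKit.h>
#import "GVRCardboardView.h"
#import "TreasureHuntRenderer.h"
#import "TreasureHuntRenderLoop.h"

@interface TreasureHuntViewController : UIViewController
@end

@implementation TreasureHuntViewController {
  GVRCardboardView *_cardboardView;
  TreasureHuntRenderer *_renderer;
  TreasureHuntRenderLoop *_renderLoop;
}

- (void)loadView {
  _renderer = [[TreasureHuntRenderer alloc] init];

  _cardboardView = [[GVRCardboardView alloc] initWithFrame:CGRectZero];
  _cardboardView.delegate = _renderer;  // conforms to GVRCardboardViewDelegate
  _cardboardView.vrModeEnabled = YES;   // NO would give a monoscopic view
  self.view = _cardboardView;
}

- (void)viewWillAppear:(BOOL)animated {
  [super viewWillAppear:animated];
  // The render loop invokes -render on the GVRCardboardView every display frame.
  _renderLoop = [[TreasureHuntRenderLoop alloc] initWithRenderTarget:_cardboardView
                                                            selector:@selector(render)];
}

- (void)viewDidDisappear:(BOOL)animated {
  [super viewDidDisappear:animated];
  [_renderLoop invalidate];  // stop driving rendering when off screen
  _renderLoop = nil;
}

@end
```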

Define a renderer to implement the GVRCardboardViewDelegate protocol

GVRCardboardView provides a drawing surface for your rendering and
coordinates the drawing with your rendering code through the
GVRCardboardViewDelegate protocol. To receive these callbacks, the
TreasureHuntRenderer class implements GVRCardboardViewDelegate:
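A skeleton of such a renderer is sketched below. The delegate method names follow the GVRCardboardView header; the bodies are placeholder comments, not the sample's actual code.

```objc
#import "GVRCardboardView.h"

@interface TreasureHuntRenderer : NSObject <GVRCardboardViewDelegate>
@end

@implementation TreasureHuntRenderer

- (void)cardboardView:(GVRCardboardView *)cardboardView
     willStartDrawing:(GVRHeadTransform *)headTransform {
  // One-time GL setup: compile shaders, create vertex buffers, start audio.
}

- (void)cardboardView:(GVRCardboardView *)cardboardView
     prepareDrawFrame:(GVRHeadTransform *)headTransform {
  // Per-frame setup shared by both eyes.
}

- (void)cardboardView:(GVRCardboardView *)cardboardView
              drawEye:(GVRCardboardViewEyeType)eye
    withHeadTransform:(GVRHeadTransform *)headTransform {
  // Render the scene; called once per eye.
}

- (void)cardboardView:(GVRCardboardView *)cardboardView
         didFireEvent:(GVRUserEvent)event {
  // React to viewer button presses (kGVRUserEventTrigger) and other events.
}

@end
```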

Implement prepareDrawFrame

To set up rendering logic before the individual eyes are rendered, implement
-cardboardView:prepareDrawFrame:.
Any per-frame operations that apply to both eyes should happen here; it is
a good place to update your model and clear the GL state for drawing. The
TreasureHunt app computes the head orientation and updates the audio engine here.
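A sketch of that method, condensed from the sample: the _gvrAudioEngine ivar is assumed, and the audio selectors should be checked against the SDK headers.

```objc
- (void)cardboardView:(GVRCardboardView *)cardboardView
     prepareDrawFrame:(GVRHeadTransform *)headTransform {
  // Compute the head orientation from the head pose matrix.
  GLKMatrix4 headFromStart = [headTransform headPoseInStartSpace];
  GLKQuaternion headRotation =
      GLKQuaternionMakeWithMatrix4(GLKMatrix4Transpose(headFromStart));

  // Feed the orientation to the spatial audio engine each frame.
  [_gvrAudioEngine setHeadRotation:headRotation.x
                                 y:headRotation.y
                                 z:headRotation.z
                                 w:headRotation.w];
  [_gvrAudioEngine update];

  // Clear GL state once for the whole frame, before the per-eye draws.
  glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  glEnable(GL_DEPTH_TEST);
}
```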

Implement drawEye

The drawEye delegate method,
-cardboardView:drawEye:withHeadTransform:,
provides the core of the rendering code, similar to building a regular
OpenGL ES application.

The following snippet shows how to implement drawEye
to get the view transformation matrix for each eye and the perspective
transformation matrix. Note that this method gets called for each eye. If the
GVRCardboardView does not have VR mode enabled, then eye is set to the center
eye. This is useful for monoscopic rendering, to provide a non-VR view of the
3D scene.
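A sketch along those lines is shown below; the near/far plane values and the -drawSceneWithProjection:view: helper are illustrative stand-ins for the app's own scene-drawing code.

```objc
- (void)cardboardView:(GVRCardboardView *)cardboardView
              drawEye:(GVRCardboardViewEyeType)eye
    withHeadTransform:(GVRHeadTransform *)headTransform {
  // Each eye draws into its own region of the surface.
  CGRect viewport = [headTransform viewportForEye:eye];
  glViewport(viewport.origin.x, viewport.origin.y,
             viewport.size.width, viewport.size.height);

  // Perspective projection for this eye (near/far planes chosen by the app).
  GLKMatrix4 projection =
      [headTransform projectionMatrixForEye:eye near:0.1f far:100.0f];

  // View matrix: the eye-from-head offset composed with the head pose.
  // Without VR mode, eye is kGVRCenterEye and the offset is identity.
  GLKMatrix4 view =
      GLKMatrix4Multiply([headTransform eyeFromHeadMatrix:eye],
                         [headTransform headPoseInStartSpace]);

  [self drawSceneWithProjection:projection view:view];  // hypothetical helper
}
```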