This entry describes our efforts to get 360 video working in Unity for apps running on Gear VR and Cardboard (both iOS and Android). This is more of a work in progress than a full guide, but I hope it helps. Feel free to leave questions or suggestions!

The overall process is:

Start with the Google Cardboard camera or Oculus camera demo scene.

Add a sphere with an equirectangular UV mapping and inward facing normals around the camera.

Purchase a plugin to play a movie on that sphere’s texture. (Note: if you just want to run on Gear VR, you can adapt their movie example to play on the inside of a sphere. I’m not sure how to extract their movie-playing code into a Cardboard Android app, though.)

Use MP4 or Ogg Theora files that are compatible with the platform. Some resolutions and frame rates didn’t work for me.

Movie & Video Plugins for Unity and Android / iOS / Gear VR

The most success so far has been with the Easy Movie Texture plugin for Unity, currently $45. It works on Android (using the Android MediaPlayer) and iOS, and the developer gives great support. It even supports streaming and playing from the SD card. Note that it doesn’t work in the Editor, so you have to build to a device to see the results.

The best part of this plugin is that it comes with a demo scene including a sphere to play back equirectangular videos. Getting a suitable sphere is a little tricky, since the default sphere in Unity has normals facing outwards rather than in, and doesn’t have the right UV mapping.

I also tried prime31’s iOS video plugin, LiveTexture, which was $75. This one was nice because videos play inside the Editor, but I had to provide my own sphere, it doesn’t work on Android, and sound doesn’t play. The plugin provides a way to sync an audio track along with the video, but this seems likely to drift out of sync. I also had trouble with larger-format videos being very choppy: a 2048×1024 video was unwatchable on an iPhone 5S, which is my lower-end target for playing back 360 video with Cardboard. Anything below 1000px of vertical resolution starts to look poor under a magnified 360 view.

Another one that’s also $75, but free to try, is Mobile Movie Texture. This one requires Ogg Theora files, and doesn’t play audio along with the video. It does support Android and iOS.

One other plugin I tried plays back a sequence of images: Universal Video Texture ($20). This also doesn’t sync audio, but has the potential for extremely high-definition playback at the cost of extreme app size, which is not ideal for a mobile app. A lot of the compression in an MP4 or Ogg Theora file comes from the codec exploiting the fact that most pixels don’t change from one frame to the next.

360 Equirectangular Sphere for VR Movie Playback in Unity

If you’d like a sphere to use with LiveTexture or one of the image sequence players, check out the roundscreen.obj (has holes at the poles) here, or an inward-facing-normals sphere with 32×16 vertices I made in Blender. I’m not sure how the number of vertices affects the resulting quality vs. performance, but lower-poly spheres seem to look very bad near the poles. Here are the instructions I followed to create the UV-wrapped sphere if you’d like to try out much higher vertex counts. If you find a sphere with the normals facing out rather than in, you can run this reverse-normals script on the object.
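As an illustration of what such a reverse-normals script typically does (this is a sketch, not the exact script linked above): it flips every vertex normal and also reverses the triangle winding order so Unity doesn’t backface-cull the inside of the sphere.

```csharp
using UnityEngine;

// Sketch of a reverse-normals helper. Attach to the sphere object.
public class ReverseNormals : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;

        // Point every normal inward.
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            normals[i] = -normals[i];
        mesh.normals = normals;

        // Reverse the winding order of each triangle so the
        // inward-facing side is the one that gets rendered.
        for (int sub = 0; sub < mesh.subMeshCount; sub++)
        {
            int[] tris = mesh.GetTriangles(sub);
            for (int i = 0; i < tris.Length; i += 3)
            {
                int tmp = tris[i];
                tris[i] = tris[i + 1];
                tris[i + 1] = tmp;
            }
            mesh.SetTriangles(tris, sub);
        }
    }
}
```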

(With LiveTexture, I also had some strange wrapping issues with non-power-of-2 video resolutions (i.e., not 1024, 2048, etc.). I got around those by commenting out the line containing “updateMaterialUVScaleForTexture” in the plugin code.)

In addition to reversing the normals, you may also need to mirror the UV map; i.e., the video might play back flipped. The Easy Movie Texture plugin flips the UV y coordinates (on iPhone only?) by iterating through all the UVs and reversing the y:

vec2UVs[i] = new Vector2(vec2UVs[i].x, 1.0f - vec2UVs[i].y);
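In context, that line sits inside a loop over the sphere mesh’s UV array; a minimal sketch of the whole flip looks like this (the component setup is assumed, not taken from the plugin source):

```csharp
using UnityEngine;

// Sketch: flip the UV y coordinates of the attached sphere mesh in place.
public class FlipUVs : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector2[] vec2UVs = mesh.uv;
        for (int i = 0; i < vec2UVs.Length; i++)
            vec2UVs[i] = new Vector2(vec2UVs[i].x, 1.0f - vec2UVs[i].y);
        mesh.uv = vec2UVs; // write the modified UVs back to the mesh
    }
}
```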

Video Resolution, File Location, Frame Rate and Other Considerations for VR on iOS and Android

I was able to play 4096×2048 MP4s on Android; however, files with a 50 fps frame rate stalled repeatedly on a Galaxy Note 4, whereas 25 fps files played perfectly (using the Easy Movie Texture plugin). Halving the bitrate or the resolution had no effect on playability; only the frame rate mattered. I’m still trying to wrap my head around what frame rate I actually see, given that the Unity app is running at one frame rate and the video at another.

One tip for experimenting with different files is to play off the SD card on Android. This way you don’t have to import the videos into Unity. Placing an MP4 into the StreamingAssets folder in a Unity project causes Unity to try to convert it to an Ogg Theora file, which can take hours for videos over a few minutes long, and is completely unnecessary since the plugin plays the original MP4. You can also give the file a bogus extension like .mp42 to prevent Unity from converting the video. This worked for me on Android, but not iOS.

To play a file from the SD card, simply give the Str File Name variable in Easy Movie Texture the following full path:

file:///storage/extSdCard/myvideo.mp4

Note that the path to the SD card may be different on different phones; the above is only for the Galaxy Note 4. You can find your path with the following command:

adb shell 'echo ${SECONDARY_STORAGE%%:*}'
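If you prefer to set the path from code rather than the Inspector, here is a sketch. The component and field names (`MediaPlayerCtrl`, `m_strFileName`) are the ones used in the Easy Movie Texture demo scenes; verify them against your plugin version, and note the SD card path is the Note 4 one from above.

```csharp
using UnityEngine;

// Sketch: point the Easy Movie Texture player at a file on the SD card.
// Attach to the same object as the plugin's Media Player Ctrl component.
public class PlayFromSdCard : MonoBehaviour
{
    void Start()
    {
        MediaPlayerCtrl player = GetComponent<MediaPlayerCtrl>();
        player.m_strFileName = "file:///storage/extSdCard/myvideo.mp4";
    }
}
```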

Note that on iOS, you should be sure to run one of the patches inside the EasyMovieTexture folder, depending on your version of Unity: Unity5_Patch_IOS or Unity463_Patch_IOS.

Another big restriction on iOS, at least on an iPhone 5s, is that the maximum video resolution is 1920×1080. So for 2:1 equirectangular movies, this means 1920×960, almost half of what Android can handle. I believe the iPhone 6 can play larger files, but I have not tested it yet.

Open Questions

What is the limit of playback fps and resolution on both iOS and Android? I know Gear VR supports 4096×2048 at 60fps, but is this possible from within Unity?

Thanks a lot for this post! This is something I’ve been researching for a long time, though I haven’t come as far as you.
I’ve backed the Sphericam 2 and am anxiously awaiting it (hopefully this December) and want to be as ready as possible to produce a complete experience for the Gear VR. Mainly I expect to need a UI on top of the video and maybe some dynamic elements, and it seems that Unity would be the optimal solution, though I’m at a loss as to why 360 video hasn’t been addressed in the mobile SDK examples… Their video example is a joke. Your post is therefore perfect timing!

Do you know of any good communities for Gear VR 360 video? Reddit is only 20% developers, and the Oculus forum doesn’t seem that interested in video.

I’m also very interested in WebVR, but I think it has some way to go. To my surprise there are actually a lot of good examples of WebVR video. It’s just too unstable, and the Gear VR doesn’t have a way to view them.

The Sphericam 2 looks very exciting. I was able to use their video example as a base for getting video to work, but I moved on to other Unity plugins to support iOS and Android Cardboard.

I’ve been referencing a mix of the Unity forums, Stack Overflow, Reddit, and the Oculus forums 🙂

I think WebVR is promising for Cardboard. On Gear VR, if you don’t have high resolution and fast response, it really shows. krpano has a video option in their latest release too, but that’s obviously still browser-based, so I don’t know how you’d view it in Gear VR.

Hi, can you tell me what Unity version you are using and what version of the Oculus Mobile SDK?
When I try to do this and play my app on a Samsung S6, I get a message that the application must close. ;]
Thanks in advance

Unity 4.6.4 (later versions distort audio on Gear VR for us) and 0.6.0.1 for the Oculus Mobile SDK. The message you got probably means you didn’t place the correct signature file (oculussig_xxxxxxxx) for your specific phone in Assets/Plugins/Android/assets/. https://developer.oculus.com/osig/

Some really good insight here. Tried a few free plugins without success then decided I’d risk it and purchase the recommended one on this article (Easy Movie Texture) and got it running straight away. Thank you!

I don’t know why, but on my octa-core phone every 360 VR video played with your method runs at 0.5–1 fps (yes, that’s between 1 and 2 seconds per frame), while the apps I tried from Google Play play those same videos fluidly. I’m talking about a bunch of videos at 30 fps (I also tried 60 fps with the same result) and resolutions between 2000×2000 and 4000×4000.

Without knowing a lot more about your setup (Unity version, phone, video encoding, etc.), it’s hard to say. I do think playing back videos within Unity will be more limiting than an app designed specifically for playing videos. Thanks for your feedback; we’re still trying to research what’s possible.

First, thanks for writing this up! I’m still not up and running, so I’ll defer fist-bumps until I actually see spherical video running on my iPhone, but still, yesterday I despaired of this working at all, and now it seems there’s a chance it will work on iOS/Android.

I’ve had previous success with Oculus running my spherical videos, so I’m reasonably conversant in the basics, and I’m just trying to hack around the limitations of doing the same on Cardboard (iOS/Android).

I downloaded Easy Movie Texture, but the documentation is Android-specific. Could you provide a concise point-by-point of your spherical-video setup using Easy Movie Texture? I don’t need a lot of detail, just high-level steps, if you have a couple of minutes.

OK, so I kind of have it running on my iPhone 6+. I made a prefab of the video sphere in the video sphere demo scene, then dropped that into the hierarchy of the scene I’m actually developing. I then figured out that my video needs to be a string reference on the str_file_name field of the video sphere’s Media Player Ctrl script, and that the video needs to be an MP4 file (not OGG) in the StreamingAssets directory. But I still just see a white sphere when I run the app using one of my own videos. As soon as I switch back to the example video I’m fine. I’m thinking the issue is the dimensions of my video, which is a 360 video, not a rectilinear video like the example. The 360 video I’m trying to run is 3906×1952.

Sorry, but I’m just going to keep leaving questions here, since you seem to be the only person on the internet who has this working and is not bound up in NDA restrictions.

I just tried using a standard 1280×720 GoPro MP4, and I still wound up with a white (blank) sphere. Again, if I run the example video included in the plug-in, I see that video. The plug-in’s author suggested my resolution is too high, but that seems unlikely since it’s just video from a “White” GoPro 3.

I think I got up to 1920×1080, so your 1280×720 should work. I definitely had to stay within 1920×1080 for the iPhone 5s; I would think you could play just about anything on the 6+. Do you see any errors in the Xcode log? I had some issues with frame rate, but only on Android, and the video still played; it was just choppy.

Another thing to try is to sync the movie file with iTunes to make sure it will play outside of a Unity app.

Thanks! I tried reducing the resolution to 480p. Now I’m wondering if I’m just missing something in my usage of the plugin. Did you need to do anything other than drop an MP4 into StreamingAssets and create a reference to it on the video sphere script? I see the Manager prefab, but don’t know what it’s for.

Thanks foundry45, I finally got this working. I can confirm for any other developer looking at this that the Easy Movie Texture plug-in totally allows you to use a movie as a texture in Unity and successfully publish to an iOS device. I don’t yet have an Android phone, so I can’t personally attest to it working on Android, but that seems to be the plug-in developer’s target platform, so I’m sure it does.

The steps I followed to get this working were:

– Reduce the size of the original 360 video by exporting from QuickTime (Mac): File -> Export -> 720p
– Change the extension of the resulting file from ‘mov’ to ‘mp4’
– Ensure the string reference to the file, the ‘Str_File_Name’ field on the Media Player Ctrl script component, includes the extension. So “my720pMovie.mp4” is entered into this field as “my720pMovie.mp4”, not simply “my720pMovie”. Doh.

Thanks again foundry45, for doing the write-up on this project-saving plug-in. I know there are a lot of unanswered Stack Overflow questions on this subject, so I’m going to go earn a lot of points right now by finding them all and pasting this answer in 😉
With full credit to foundry45, of course.

I’m still dropping frames at 720p! 480p was smooth as butter, but the image is pretty muddy. I just added a simple terrain to the scene and I think I lost even a few more FPS. I’m tempted to go with the lower image quality/higher frame rate.

Then again, it was just a quick and dirty 360 video stitch. I should definitely go back and optimize my stitch before lamenting too much about dropped frames.

@ Scott
Hi
As it has been a while since this article was published, do you still recommend the Easy Movie Texture plugin, or is there any newer or free plugin for playing 360 videos in Unity? And does Easy Movie Texture work with the free version of Unity, or does it require Unity Pro?

I definitely recommend Easy Movie Texture; it is easy to use and works just as we want. Unfortunately, since the latest update two weeks ago, the price of this plugin is now $55 (instead of $45). But if you need it, buy it; you won’t be disappointed.

The Pro version IS NOT required; EMT works with the Personal (free) version of Unity.

Man, I’ve been fighting with lots of issues using Unity + Mobile Movie Texture. I used an OGV video at 1000×500 px and 30 fps on the iPhone 5s, and playback was smooth. This plugin does not allow use of audio, so you have to synchronize it yourself. The latest version (2.1.2) does not work; it always gives a Mach-O linker error regardless of the architecture you set (armv7, arm64, or universal) or whether you use IL2CPP or Mono, apparently because “libtheorawrapper” doesn’t have a correct definition for armv7 (I’m supposing; I’m not an expert). So you have to roll back to a previous version of the plugin in order to compile in Xcode. For the Cardboard experience I use the Dive plugin. There is something that is driving me nuts: with the same setup and the same resolution video on supposedly better hardware like the iPhone 6, instead of a better experience I get the opposite. The video is laggy, sometimes completely stopping and skipping to another frame, rendering it unplayable and unusable on the iPhone 6. For a not-so-bad playback I have to dramatically reduce the resolution and bitrate; the video hardly flows and the image quality is horrendous. For now I’m restricted to the iPhone 5s. Mobile Movie Texture works in Unity free.

Thanks for sharing. We used your method and it worked fine for us, but we are now running into a strange issue: when viewing our video through the Gear VR, horizontal straight lines look a little wavy and distorted, like something is wrong with the mapping of the sphere. This issue is not visible when viewed through the Oculus Rift in Unity; it appears only on the Gear VR, which leaves us guessing. Any idea what that might be?

Has anybody tried to make 3D 360° videos work with Easy Movie Texture? My guess would be that one could have two separate videos (L, R), one for each eye, then make two separate cameras (like the Oculus prefab) and two separate spheres, and render each sphere on only one of the cameras.
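A sketch of the per-eye culling-mask idea described above: put each sphere on its own layer and let each eye camera see everything except the other eye’s layer. The layer names “LeftEye” and “RightEye” are assumptions; you would create them yourself in the editor’s layer settings.

```csharp
using UnityEngine;

// Sketch: assign each eye's sphere to its own layer and configure
// the two eye cameras' culling masks accordingly.
public class StereoSphereSetup : MonoBehaviour
{
    public GameObject leftSphere, rightSphere;
    public Camera leftCamera, rightCamera;

    void Start()
    {
        int leftLayer = LayerMask.NameToLayer("LeftEye");
        int rightLayer = LayerMask.NameToLayer("RightEye");

        leftSphere.layer = leftLayer;
        rightSphere.layer = rightLayer;

        // Each camera renders everything except the other eye's layer.
        leftCamera.cullingMask = ~(1 << rightLayer);
        rightCamera.cullingMask = ~(1 << leftLayer);
    }
}
```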

First of all, amazing post; thanks a lot for posting this. It helps a lot.
I’ve read every comment in the discussion, and while I already have my client’s app published, I’m facing a new challenge:

Allow the user to drag-rotate the sphere.

Now, I know this has nothing to do with any plugin, and I’m actually capable of rotating a sphere using my finger (Unity 5.2.1, iOS, Android) in a normal environment, but how about rotating it while the camera is inside the sphere?
1. Should I face the normals inward and then add a mesh collider?
2. Should I calculate the raycast target and update toward my finger movement?

I’m wondering why I can’t find anything on this, since Facebook and YouTube already have this feature…

On a touch/hold event, you can parent the sphere to the camera object; on touch release, un-parent it. As long as the sphere is located at the same transform position as the camera object, this should yield a drag/rotate effect.

This is how I would implement drag/rotate, but I recommend that you do not implement it. It can make your user sick quite easily. On touch/hold, the current viewport would lock to the user’s face and follow wherever the head moves. It’s okay on YouTube because there’s no head tracking. If you’re moving the sphere with your face, then it will look as though head tracking has broken during the drag… very dizzying.

Instead, there are a couple of alternatives:
A) Swipe left/right to yaw the sphere (recommended). Swipe up/down to pitch it (not recommended). This will still be slightly disorienting, but at least head tracking will not seem broken.
B) Snap reorientation… this is a feature built into the OS on GearVR. If you want to implement reorientation manually so that it works similarly inside your app on all deployment platforms, you’ll want to slerp the sphere to the camera’s current rotation. I recommend doing this for yaw only. Pitching or rolling feels like someone is tipping the world on its side. I also recommend having a UI appear with reorientation as a visible option. The user will be caught off guard by it if the UI doesn’t warn them what’s about to happen.
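Option A can be sketched as follows. This is a minimal illustration, not production code; the degrees-per-pixel value is a made-up tuning constant, and the script is assumed to sit on the video sphere.

```csharp
using UnityEngine;

// Sketch: swipe left/right to yaw the video sphere around the world
// up axis only, so head tracking never appears to break.
public class SwipeYaw : MonoBehaviour
{
    public float degreesPerPixel = 0.1f; // arbitrary tuning constant

    void Update()
    {
        if (Input.touchCount == 1)
        {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Moved)
            {
                // Horizontal finger movement -> yaw rotation.
                transform.Rotate(Vector3.up,
                    -touch.deltaPosition.x * degreesPerPixel,
                    Space.World);
            }
        }
    }
}
```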

I am experiencing audio/video desynchronization when using the FB360 Spatial Workstation .tbe files. I’ve added the requisite sync script, but there is a black screen for about 0.3 sec when the video starts, which seems to correspond to the offset in the audio.

Do you know why this would be happening? Is there anything I can do to adjust the playback of the video on the sphere so the audio and video are in perfect sync?

What’s your sync code? I did one project with TBE and didn’t run into sync issues, although now that I think about it, the audio didn’t have anything obvious to line up with in the video. What device are you playing back on? Have you tried different platforms? The TBE support team is great.

The one thing I should mention is that I’m trying to get sync in the Editor/preview window in Unity, not building to a device (at least not for now). I am placing the sync script above on the sphere I am using, which also has the TB Spat Decoder AND the Media Player Ctrl script on it. Is it possible that the scripts need to be attached somewhere else? If you can tell me where you attached your various scripts, I can at least rule that out. Thanks!

Just bumping this last post – can you tell me where the scripts need to be attached for this to work correctly? I have the sync script, TB Spat Decoder and Media Player Ctrl scripts all attached to the sphere. Is this correct? Thanks!

I have been playing with this and your tutorial. I downloaded the sphere from the link on Reddit you posted, but when I build the project onto my Android phone I only hear the sound of the video and can’t see a thing: just a white sphere. (I’m using the Easy Movie Texture plugin from the Asset Store.) I also placed the camera outside the sphere in case the video was being rendered on the outer face, but no luck there either. :/

Hi Ruben, were you able to get the pre-built sphere scene and included video to build and run on your phone? A white sphere probably means the video isn’t playing, which could be an unsupported video format, or it’s not being triggered to run correctly. We’ve had lots of problems where we load a video and then try to play it immediately afterwards, but nothing happens. We add a pause in between and it works. :/
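A sketch of that load-pause-play workaround, assuming the Easy Movie Texture component and field names (`MediaPlayerCtrl`, `m_strFileName`, `Play()`); the one-second wait is an arbitrary choice, and if your plugin version exposes a load-complete callback, that is the better trigger.

```csharp
using System.Collections;
using UnityEngine;

// Sketch: wait briefly after load before calling Play(), as a
// workaround for videos that silently fail to start.
public class DelayedPlay : MonoBehaviour
{
    IEnumerator Start()
    {
        MediaPlayerCtrl player = GetComponent<MediaPlayerCtrl>();
        player.m_strFileName = "myvideo.mp4"; // hypothetical file name

        // Give the native player time to finish loading.
        yield return new WaitForSeconds(1.0f);
        player.Play();
    }
}
```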

Hi, I’ve created a stereo 360 Unity video player for Gear VR using culling masks and two spheres with a 0.5 UV offset, for one video containing the right and left views. I’m using the Easy Movie Texture plugin. It’s not working for 4K videos on my Samsung S6: it only plays video on one sphere, and the other looks black. So I had to use two 2048×1024 videos, one for each sphere, but they play asynchronously. Has anyone had the chance to do something similar? Thanks.

I tried the setup with Easy Movie Texture and the Google VR SDK for Cardboard on Unity 5.4, but for stereoscopic 360 video. This means I have two spheres and a top-bottom video with the footage for both eyes in it. I show the top half of the video on one sphere and the bottom half on the other sphere (following this tutorial: http://bernieroehl.com/360stereoinunity/ )

My problem is video sync: I’m playing the video on two spheres, but they are slightly out of sync, which kills the stereo effect. Has anyone run into this and found a solution?

To follow up: I solved this by using the same material on both spheres, and creating sphere meshes in 3DStudioMax with custom UV maps, where the UV map assigns the top half of the texture to one sphere and the bottom half to the other.
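The same UV remapping can also be done in code rather than in 3DStudioMax; a sketch of the idea, compressing each sphere’s v coordinates into one half of the texture:

```csharp
using UnityEngine;

// Sketch: remap an equirectangular sphere's UVs so it samples only
// the top or bottom half of a shared top-bottom stereo texture.
public class HalfTextureUVs : MonoBehaviour
{
    public bool useTopHalf = true; // top half for one eye, bottom for the other

    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector2[] uvs = mesh.uv;
        for (int i = 0; i < uvs.Length; i++)
        {
            // Compress v from [0,1] into [0.5,1] (top) or [0,0.5] (bottom).
            float v = uvs[i].y * 0.5f + (useTopHalf ? 0.5f : 0.0f);
            uvs[i] = new Vector2(uvs[i].x, v);
        }
        mesh.uv = uvs;
    }
}
```

Because both spheres then share one material and one video texture, there is nothing to fall out of sync.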

Hi! And thank you for this guide!
I have a question; maybe you can answer it. I need to control the 360 player with UDP commands, so I wrote a script for asynchronous reading of the network and UDP reception, but when I move the app to the phone and send a UDP command, the app crashes.
