
Sure, that's possible. It's not hard at all. All you need is a sphere with correct UVs (no, the standard sky sphere probably won't work, as its UVs have a slightly different layout than a typical spherical projection). You'll need some tricks to mask the pinching at the top and bottom of the sphere (there are tutorials for that). If your panorama is stereo, you'll need to set up a special material for your sphere. It's simple: you just determine which texels should go to each half of the screen UVs. Like this:

This is for a panorama packed into a square texture, where the top half is the left eye and the bottom half is the right.
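The per-eye remap for that packing is just a half-scale and offset on the V coordinate. Here's a plain C++ sketch of the math (the function name is mine, not engine API; in the material you'd wire the same arithmetic in the graph, using the stereo eye index to pick the half):

```cpp
#include <cassert>

// Illustrative helper (not engine API): remap a sphere V coordinate into
// a top/bottom-packed stereo texture. The top half of the texture
// (v' in [0, 0.5]) holds the left eye, the bottom half (v' in [0.5, 1])
// holds the right eye.
double PackedStereoV(double v, bool rightEye) {
    return v * 0.5 + (rightEye ? 0.5 : 0.0);
}
```

So the equator of the sphere (v = 0.5) samples the middle of the left-eye half for the left eye and the middle of the right-eye half for the right eye.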

You'll see the result immediately in your viewport: a vertical line, like a fracture. It won't be aligned with the center of your editor viewport; that's normal. On the device it'll look fine.

Then you just set up your pawn (only a camera, and that's pretty much it) at the center of the world (make sure the sphere is also aligned to world 0), add some interactivity to switch between panoramas (widgets, a widget interaction component, or just a line trace to 3D buttons), and you're set. Since you don't have any kind of controller in Cardboard, you'll need to use gaze delay to detect input from the user.
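The gaze delay boils down to a dwell timer that resets whenever the gaze leaves the button. A minimal C++ sketch of that logic (the struct and names are illustrative, not engine API; in practice you'd drive it from the pawn's tick and a line trace):

```cpp
// Illustrative gaze-dwell timer (not engine API): accumulates time while
// the user keeps gazing at a target and fires exactly once when the
// dwell threshold is reached.
struct GazeTimer {
    double Threshold;      // seconds the user must keep gazing
    double Elapsed = 0.0;  // time accumulated so far

    // Call every tick with the current gaze state; returns true only on
    // the tick the dwell completes. Looking away resets the timer.
    bool Tick(bool gazingAtTarget, double deltaSeconds) {
        if (!gazingAtTarget) {
            Elapsed = 0.0;
            return false;
        }
        double before = Elapsed;
        Elapsed += deltaSeconds;
        return before < Threshold && Elapsed >= Threshold;
    }
};
```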

But I must warn you that the engine isn't really that good at this kind of thing. Its renderer is too overcomplicated for such tasks, so you'll encounter a significant loss in quality: either blurriness or crawling lines. The first means you're using mips in your panoramas (don't do that); the second means your panoramas are much bigger than the screen texture used to project them.

You can match the screen texture to the panorama using r.ScreenPercentage (a number higher than 100 and lower than 300) or vr.PixelDensity (a value higher than 1), but that's performance-heavy, and if you really need 8K panorama fidelity, it might not be worth it; you may have to use Unity or the Google VR SDK for Android instead. More on that here: my question on panorama quality
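To pick a ballpark value, compare the panorama texels visible inside the FOV against the eye buffer width. A rough C++ sketch of that arithmetic (the function name and the clamp range are just the rule of thumb from above, not anything the engine exposes):

```cpp
#include <algorithm>

// Back-of-the-envelope estimate (not an engine function): roughly how high
// r.ScreenPercentage should be so the eye buffer has about one pixel per
// visible panorama texel.
//   panoramaWidth - horizontal texels of the full 360-degree image
//   hFovDegrees   - horizontal FOV of the headset (~90 on Cardboard)
//   eyeWidth      - native per-eye render width in pixels
double NeededScreenPercentage(double panoramaWidth, double hFovDegrees,
                              double eyeWidth) {
    double visibleTexels = panoramaWidth * hFovDegrees / 360.0;
    double pct = visibleTexels / eyeWidth * 100.0;
    return std::clamp(pct, 100.0, 300.0); // the range suggested above
}
```

For example, an 8192-wide panorama with a 90-degree FOV shows 2048 texels across; on a 1024-pixel-wide eye buffer that suggests roughly 200 percent.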

Thank you for the very fast reply. I got it working by changing the mapping of the sphere the user is located in. However, it feels like the user is tiny and close objects seem big. Is there a way to compensate for that?

I tried to convert it to DDS. I don't have Photoshop, so I used ModifiedCubeMapGen, but the output is always just the colored faces, not the texture. Do you have any experience with that tool?

Well, that's to be expected. It's probably down to the way the panorama was shot. In the engine there's little you can do, since the panoramic image is static. The pawn and the sphere should be locked to each other; the only movement allowed is the rotation of the camera. You could also tweak the camera's FOV to try to match the panorama's curvature. The standard 90 degrees usually works, but if the panorama was shot with a special lens, it might be too curved or not curved enough.

Cubemaps are a different beast entirely, and not what you need. They're reserved for in-engine uses like static reflections, are usually very small (e.g. 512x256), and are stored in an HDR format.

You need a LongLat projection picture. It looks like this:

If it looks different (e.g. straighter vertical lines and no top/bottom, or a cross made of squares), you need to convert it to the LongLat projection first. There's a node in the material editor that does this (Cylindrical to LongLat or Cubemap to LongLat, something like that), though I'm not sure it works on non-HDR input. It's better to convert elsewhere, in modeling or post-processing software (reproject using a special camera): the fewer complex shaders you use on mobile, the better.

Why use LongLat? It's usually the most efficient way to pack a panorama into a 1:1 or 2:1 texture. Other methods have their uses, but not for this task.
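For reference, the LongLat (equirectangular) mapping itself is just longitude and latitude converted to UV. A small C++ sketch (the function name is mine; this assumes a Z-up unit direction, longitude from atan2, latitude from asin):

```cpp
#include <cmath>

struct UV { double u, v; };

// LongLat (equirectangular) projection: map a unit view direction to UV.
// Longitude wraps the full 360 degrees into U; latitude maps pole-to-pole
// into V, which is why the top and bottom of the sphere pinch.
UV DirectionToLongLat(double x, double y, double z) {
    const double pi = std::acos(-1.0);
    double u = std::atan2(y, x) / (2.0 * pi) + 0.5; // longitude -> [0,1]
    double v = 0.5 - std::asin(z) / pi;             // latitude  -> [0,1]
    return {u, v};
}
```

The forward direction lands in the middle of the texture, and the straight-up direction lands on the top edge, matching how a LongLat image looks.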

Huh. I've never tried to actually change the FOV in VR. It makes sense that it's blocked. (An unnatural angular velocity relative to actual head movement is the best recipe for projectile vomiting, heh.)

You can try something else that may do the same thing: change the scale of the sphere. If I understand it correctly, the bigger the sphere is, the lower the FOV angle. Or... not. You'll have to experiment with that.

That was the first thing I tried. Resizing the sphere helped a lot (10x the size), but it still doesn't feel natural, and "close" objects are huge. I tried more panorama photos found on the net with the same results (but not with yours :S). Will try again later today.