I downloaded the merge-two-mp4 tool but it would not work; it says it's ISO, but to me it looks like an exe file.

That's really odd, but I'm not sure about it either. I think it's the codec: the APV output on my end says .mp4, but I am unable to view it with the Windows 8 video player. I am able to view it in VLC. And I have weird experiences when it's uploaded to my server and viewed through the web: fine on my Mac, choppy on my PC, not viewable on my iPad, and I'm not sure about Android. When the file sits on my Windows machine, the .mp4 file icon is an APV icon and not a video icon. I'll have to ask around some of the film folks I work with about what's going on there. Ultimately, I know I will have to send it through an app called HandBrake to optimize it for the web.

I downloaded the trial version and reprocessed your videos... Now I have two MP4s and two TIFFs...

The two TIFFs, I assume, were generated when the video panos were edited through APG; they're necessary to create a "template" for what the .mp4 output will look like.

The tutorial is very limited... Perhaps it's assuming some prior knowledge...

I found this Kolor APV wiki to be quite adequate for the process of rendering a single pano video, and Kolor's wiki for 3D assumes that you've read the single-pano wiki tutorial. The learning curve is far less than when I first got APG & PTP back in 2011. The only thing absent for now is merging left & right, for which Kolor points to ffmpeg, and which is understandably outside Kolor's support. I am hoping the people of this forum can come to the rescue.
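For anyone else stuck on that missing step, here is a minimal sketch of the kind of ffmpeg command Kolor points at. The file names and the generated test clips are stand-ins, not anything from Kolor's docs; substitute your actual left-eye and right-eye pano renders:

```shell
# Stand-in inputs: two short test clips in place of the real
# left-eye and right-eye pano renders (hypothetical file names).
ffmpeg -y -f lavfi -i testsrc2=duration=1:size=320x240:rate=24 left.mp4
ffmpeg -y -f lavfi -i testsrc2=duration=1:size=320x240:rate=24 right.mp4

# Merge them side by side (left eye on the left) into one frame.
# hstack doubles the width, which is the layout side-by-side 3D
# players expect.
ffmpeg -y -i left.mp4 -i right.mp4 -filter_complex "hstack=inputs=2" sbs.mp4
```

Note that hstack requires both inputs to have the same height; for a top-bottom layout, vstack works the same way.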

A new, advanced technology really deserves a much better tutorial, one that assumes nothing... There need to be lots of visuals too... This software has been available for a while now, so where is the How To Do It? Is this the best Kolor can come up with? http://www.autopano.net/wiki-en/action/ ... 360_videos

Again, there was that other tutorial. The 3D aspect is still blazingly new. Prior to the Oculus Rift (OR), there really wasn't a practical platform for this format of media. The first OR was built just a little more than a year ago, in late 2012; it has only recently caught a lot of wind, and it's still in the developer stage of production. I first heard of it in August of 2013, and first heard of 3D video in December. I'm not surprised by the scarce documentation. Kolor's is actually not too bad, though; it just needs the merging of the two vids.

FFmpeg is the only free software that I am aware of. You get what you pay for, and things still cost: where I'm trying to cut corners on money, it's costing me time to learn. There is another piece of software that I'm going to explore: a plugin for Adobe After Effects called V3. I'll need to subscribe to Creative Cloud for AE and purchase V3. Their lite package is only $50 and seems to do all that I need: side-by-side merging. I'll still need to learn how to use AE. Fortunately, I've got an expert in the studio.

I was wondering if there would be a time when this headset could be used for a normal image VR Tour where the hot spots could be actioned in Virtual Space.

I was able to produce a single pano view in 3D and view it through the OR. There is a section of krpano.com I haven't had much time to explore for making a full virtual tour. Honestly, if I recall correctly, there are issues with hotspots & HTML5, similar to the problems of making a VR tour with video panos. PTP just isn't able to support mobile, but I believe Flash is able to support it. I'd have to spend some time turning my wheels to understand the current situation in detail.

I managed to test out your 360 video and was really disappointed with the quality of the stitching.

1) I used the 360Heros 3D rig. Video panos are inherently poor at stitching because the size of the cameras makes them unable to capture at the nodal point, and I think this is exacerbated by the stereoscopic nature.

2) I didn't spend any time attempting to improve the stitch in APG, mainly because I was racing to prove that I could actually output a stitched image. The test.mp4 was more about ensuring that I had the settings correct on the GoPros... Earlier, I was following Jason Fletcher's post on the fundamentals of 360 video. His GoPro settings are appropriate for his 360Heros rig, but not mine. Looking into the documentation from 360Heros, I found I have to use 1440 and not the 2.7K resolution. At 2.7K, the camera wasn't grabbing enough FoV for the stitch, and it was a terrible mess with many black holes. At 1440, I can now capture enough... not perfect, and not great, but enough.

3) My test.mp4 had the cam rig just sitting on my desk and near stuff, like the edge of my desk. It is my understanding that stitch parallax error is less noticeable if you keep at least 4 feet (1.25 meters) of distance from any subject and/or object with an odd structural form (not the floor). You can have a subject nearer, but it's critical that a camera faces directly toward it, so the subject doesn't extend beyond that individual cam's FoV. Again, the stereoscopic aspect of my rig yields more noticeable error on up-close objects. Definitely "operator error"... In my upcoming experimentation, I will direct the film accordingly.

Your movie runs a heap better in the KolorEye player

I haven't gotten to the distribution stage of my experimentation. I expect the KolorEye player to be sweet. Again, I think it all comes down to format or codec... I'll talk with my film folks about it. Ideally, I'd like a way to bundle the 3D videos in the form of an app; I'll talk with some folks about that later.

I am still confused... Am I right in saying you used one of those huge Heros 3D rigs with 12 GoPro cameras to capture and create two videos, which when viewed will display in your Oculus Rift, but the view we get to see on a PC or Mac would be two movies playing?

Correct. I'm using said 3D rig mentioned above, capturing two vids that will be merged side by side for the Rift. The OR mirrors the desktop display inside the headset. While the desktop shows the two views side by side, the Rift headset has lenses (hence the name Oculus) that feed each eye half of the display. This video explains the technical details of how the Rift works.

Am I right in saying that a 3D pano is a kind of red/green-blurred stereoscopic image, but it can also be two images or videos sitting side by side? In your case, two videos sitting side by side... So that code above is meant to align the two mp4 images side by side... That's the bit where I'm missing the plot.

It's all very exciting, but I feel it's all too early to invest in. There is nothing in our world that is 2D; 2D is just a medium we have created... In the real world, everything is 3D...

To be honest, I admire the 360Heros mob for coming up with this innovation, but I just feel it's not the right approach... First came the GoPro, then came the rigs. The thing is, I do not believe the GoPro is designed for this. It's not GoPro's fault, but I am sure they could think about redesigning their camera to better suit the rigs that are advancing in this media area...

I have made some inquiries about whether a pano could be used with an OR via a tablet, but at the moment, Android support is not fully implemented... such as in the KolorEye... I wonder when that is coming.

This 3D thing is getting confusing... I guess it will make more sense the more I test it. I do not have an OR, so it's hard to visualise what the end results would be. I've seen that stereoscopic red/green offset in an image, but is it possible to achieve with a video? To me, it seems there would be far fewer stitching issues... But that might be a silly statement, since I really do not fully understand all this yet...
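On the red/green question: yes, that kind of anaglyph can be made from a video too. ffmpeg's stereo3d filter can convert a side-by-side video into a red/cyan anaglyph viewable on any screen with paper glasses, no Rift needed. A hedged sketch, again using a generated test clip as a stand-in for a real side-by-side file:

```shell
# Stand-in SBS input: a 640x240 test clip whose left and right
# halves play the role of the two eye views (hypothetical file).
ffmpeg -y -f lavfi -i testsrc2=duration=1:size=640x240:rate=24 sbs_in.mp4

# stereo3d=sbsl:arcd reads a side-by-side frame (left eye first)
# and writes a red/cyan (Dubois) anaglyph at half the width.
ffmpeg -y -i sbs_in.mp4 -vf "stereo3d=sbsl:arcd" anaglyph.mp4
```

The trade-off is the usual anaglyph one: it works everywhere, but color reproduction suffers, which is why the headset route looks so much better.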

Keep up with the experimenting... I am sure it will pay off in the end...

Hi Andrew. I think you need FFmpeg on your system... Perhaps that's what that link Kolor provided is part of... Even though it's an exe file, I think there's a Unix build which works on Mac... Not sure at all...

Andrew... If you download the krpano-stero3d-v2, be careful with the sbs3d.html file. On my Mac it totally locked up and continued to rotate very slowly in a jiggly manner... Pressing Esc did nothing... This is all very experimental stuff, so I think Klaus has a bug or something with this file... Either that, or my old Mac is just not up to opening it... You might have better luck... It should work with your OR...

I am just checking out the krpano code and oh man.. Klaus is just so smart.. I want to marry his brain

I can't help wondering: it's all very well having this technology, but most computer users would struggle to view the media... It seems to be very hardware dependent. I guess everything will catch up in time as demand grows and new technologies are developed. I can most definitely see the value in virtual reality headsets...

I still cannot work out how to get two videos to sit side by side... I found this software, but it will not accept MP4 movies... I converted them to MPEG-1 and AVI, but they did not work... emmmm

I found this and it's really quite amazing... It seems there's heaps of work and skill in creating 360 video... The poor stitching on your movies seems like it can be fixed up to a high degree using APG 3.5... To be honest, I did not know APG could be used for control-point editing of video... I do not have any raw video to test all this... Are you able to send me some unstitched videos, so I can begin to gain a better understanding of the 360 video process?

With my limited understanding of the parallax issue, and of how it supposedly cannot be resolved because two cameras are unable to occupy the same space, to me it seems fixable... Because I have limited knowledge, I tend to look at things less complicatedly. I feel that since the rigs were designed around a camera whose position in the rig makes a perfect stitch impossible, it therefore falls onto the software, but not entirely... It's been said that the rigs are as small as they can get... So, to me, in that case make them bigger. This will all be worked out by maths... In my opinion, the cameras need to be moved out to a calculated value. There are so few variables, since only one camera and two rigs are the norm. Programmed software knows the positions of the cameras, so it can pull the images into a perfect sphere, allowing for overlap. At the moment the NPP cannot be set, since the lens is offset... So in that case, design a rig where the lens can be set to the NPP. It would be possible to achieve this if the rig were bigger.

If anyone else has raw 360 videos which I can use for testing, I would appreciate it... Not sure how my iMac will cope with the files, since more RAM did not improve the performance... When I see videos responding so quickly in real time, it makes me oh so sad... A new iMac is at the top of my shopping list...

Destiny wrote: At the moment the NPP cannot be set, since the lens is offset... So in that case, design a rig where the lens can be set to the NPP. It would be possible to achieve this if the rig were bigger. Destiny...

Yes Andrew, that's my point: two cameras cannot occupy the same space, which is what is needed to achieve a perfect result. So, I suggest the answer might be to make the rig bigger and let the software bring the images in. Since the distance the cameras are set out at is known, the software can be told: bring the image in by X value, then stitch...

Destiny wrote: At the moment the NPP cannot be set, since the lens is offset... So in that case, design a rig where the lens can be set to the NPP. It would be possible to achieve this if the rig were bigger. Destiny...

No, I don't think so.

The cameras need to be even closer together for each to be at NPP.

That's impossible. You can't have a common NPP for two cameras AND a stereoscopic view at once. That's just physics: you need a certain distance between the two lenses. Look at your eyes: you can see 3D because your eyes are a certain distance from each other. Close one eye and you're no longer able to see 3D. In pano photography it depends very much on the shooting distance, just as in usual panorama photography: shooting at long distances, you don't need a precise NPP; at close distances, you need to set the NPP very precisely. I explained some months ago that this is the basic problem in stereoscopic panoramas. I've seen several experimental stereo panos since the last Photokina. Not one worked acceptably.
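The distance dependence can be put in rough numbers. For small angles, the parallax between two lenses is about baseline/distance (in radians). A quick back-of-the-envelope calculation, assuming an eye-distance baseline of 65 mm (the figures are illustrative, not from any rig spec):

```shell
# Approximate angular parallax (radians) between two lenses
# separated by a 0.065 m baseline, for doubling subject
# distances. The error shrinks linearly as distance grows,
# which is why distant scenes tolerate an imprecise NPP
# while close subjects expose it immediately.
awk 'BEGIN {
  b = 0.065;   # baseline in metres (illustrative value)
  for (d = 0.5; d <= 8; d *= 2)
    printf "distance %4.1f m -> parallax %.4f rad\n", d, b / d
}'
```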

In cinematography they use floating parallax correction, moving the parallax between the lenses according to the actual shooting distance. In movies that works well: you see each image for just 1/24 sec while the whole scene moves (btw, this technology is extremely expensive).

Well, perhaps the GoPro is not the answer... At $600+ for one camera, that's $1200 for a 3D capture... Perhaps a different approach is in order: a 3D lens, or a camera that captures 3D. That way only one camera occupies a single space, so the NPP must be better... and since a single camera is capturing one 3D scene, there is less chance for error... The rig would also be smaller...

So, my Kolor account used to send an email whenever someone posted a comment in the thread. I am no longer receiving those notices, thus the long absence. Also, I've been very busy with developing the business.

I just want to report back that I have rendered a music video in 3D 360. Despite concerns about the newness of the tech, the GoPros, and such, the result was shockingly more impressive than I expected. I'm currently looking at a means of publishing & distribution.

ffmpeg is a powerful tool, but I'm not finding a decent plugin to offload the work to the VPU.

ATXcloud wrote:I'm just a die hard passionate individual that has a vision and sees opportunity.

Good lad! Welcome to the club

ATXcloud wrote: I'm taking a risk on a narrow market by throwing my life & life savings and every earned dollar toward the prospects of an emerging technology.

That's tricky: I've been into this for years now. Professional photographer (advertising) and former cameraman (commercials, documentaries) for about 30 years, and producing interactive content for about 8 years now.

WHO needs 360° video at all? It's still very hard to find anybody who is willing to pay an appropriate amount of money for it.

Who might it be?
1) Advertising: no. The quality still isn't good enough; the media isn't ready for it.
2) TV stations: maybe, but they don't pay enough and don't all have the technology so far.
3) News gathering for online magazines: maybe, but they also don't pay enough and hesitate to invest in the technology. We produced interactive panos for the most important German news magazine, "Der Spiegel". Good for the image, of course.
4) Touristic: they're used to cheaply produced images.

So it's hard to find clients for 360° video anyway. Can you imagine how hard it will be finding clients for 360° video in 3D?

There are SOME stumbling blocks:
1) It's rather new.
2) Things change very fast. Look at the GoPros alone: how many changes in models and features over the last two years? Some!
3) Who NEEDS interactive 3D 360° video? You see: it's surely "nice to have". But "nice to have" does not pay the money you need to produce really good work! Clients pay a good price for what they NEED... not good prices for things that are just "nice to have" but not essential.
4) 360° video and 3D panos are nice, but still too far from being perfect.
5) 3D needs to be displayed in excellent quality. How many people already have the displaying technology? Few. Very few.
6) Because of all that, potential clients are somewhat confused these days about whether to take 360° video into account, and even more so when it's 3D... there's no way to make enough money with it at the moment.
7) For toying around with all that, you'd better be rich or win money in some lottery...

Not to forget: many (!) people get confused VERY fast viewing 360° video anyway! Many people over 50 especially don't like it at all, for many reasons. This will be even more extreme when it becomes 3D...

I've read some marketing research: it will all come. Some day...

Investing lots of money in the technology NOW means losing plenty of money for quite a while by updating the hardware every 6 or 12 months.

Conclusion (just my conclusion, of course): I would not invest some 10-15,000 bucks in soft-/hardware without being sure of a return on investment in an acceptable time-scale.

The time isn't ripe for this kind of technology to be lucrative.

But I'll have a close look at it at Photokina this fall, of course...