Virtual and Augmented Reality

Bristol-based independent music magazine Crack have today unveiled a unique augmented issue. The release is centred on Aphex Twin, one of the most acclaimed figures in contemporary electronic music.

After previously working with Crack on their Bjork In Focus piece, Zubr have created the interactive content using the recently updated Apple ARKit 2.0 – which pushes the boundaries of Augmented Reality even further.

Within the issue is a 10-page segment centred on a rare interview with Richard D. James, better known as Aphex Twin, as well as artwork from his visual collaborator, Weirdcore.

In a first of its kind, Zubr collaborated directly with Weirdcore to turn his existing 3D scans & upcoming EP artwork into an augmented reality app. Users can now experience Weirdcore’s optical distortions on the magazine in 3D.

Understandably chuffed about getting a rare interview with one of the most elusive electronic musicians on the planet in the bag, the Crack team have decided to go whole hog on this one, working with both Weirdcore (who designed it) and Bristol studio Zubr (who built it) on an augmented reality app which promises to bring the magazine cover to life.

To celebrate this interview with one of the most elusive electronic musicians, Crack have planned a month of related content – starting out by hiding sneak peeks of the launch in over 100 locations worldwide, with the help of Landmark.

On top of this, Zubr also developed a bespoke Facebook filter that can be accessed through the Crack Magazine page. With the filter activated, Facebook users will be able to augment Aphex Twin’s uncanny aesthetic straight onto their faces.

When we mention Virtual Reality (VR), we think of gadgets such as the Oculus Rift and HTC Vive headsets: plug one into a computer and immerse yourself in a computer-generated reality. With Augmented Reality (AR), we think of mobile applications such as Pokémon Go: the real world with augmented objects. Pretty cool, right? Then there’s Mixed Reality – the Microsoft Hololens head-mounted display is currently on top. But what about Augmented Virtuality (AV)? What is it?

Let’s begin by taking a look at the Reality-Virtuality Continuum published by Paul Milgram and Fumio Kishino in 1994.

From the RV Continuum, we can see terms such as MR, AR and AV. VR is not shown here, but it sits at the Virtual Environment end of the continuum. Before we discuss AV, let’s briefly cover the others first.

Virtual Reality (VR)

The term VR involves full immersion in the virtual world. Everything you see is computer generated – you won’t see the real world at all, so mind your shin on that edge of your table! VR has mostly been used for gaming, but it has also proved helpful in industrial and medical training.

Augmented Reality (AR)

With Google’s ARCore and Apple’s ARKit, AR has been pushed towards mobile applications. It is useful for overlaying information on physical objects, such as for educational interactivity. Zubr have delivered a wide range of AR projects.

Mixed Reality (MR)

Currently the term MR is unclear, as it is easily confused with AR. But following the RV continuum, it encompasses two terms: AR and AV. MR, ultimately, is the combination of real and virtual objects. Now that we have a brief idea of the different terms, we can talk about Augmented Virtuality (AV). AV is similar to AR but is actually its opposite: real-world objects are composited into a predominantly virtual world.

Who would want AV?

With the research project Mixed-Reality Production Solutions, we are essentially creating a real-time AV application under the umbrella term Mixed Reality. The application aims to enhance spectator engagement with a player who is in VR. Wouldn’t it be helpful to actually see what the player is interacting with? Wouldn’t it be helpful to see the VR world from a third-person point of view?

Hey guys! Allow me to tell you a little bit about myself. My name is Kenneth Cynric Dasalla, a Research Engineer (RE) with Zubr VR and a second-year EngD student with the University of Bath. With just a mathematics background, I challenged myself to try a career that involves problem solving. I had never heard of computer science in high school, but with a bit of research, I chose to study it. I had never coded before, but I learned throughout my undergraduate degree. I entered on the standard course and finished with a specialism: BSc Computer Science with Visual Computing at Cardiff University.

During my final year, an email was sent to everyone about an opportunity for a doctoral training scholarship. Not wanting to miss out, I gave it a go, and a month or two after my graduation, I got the scholarship. I recently finished the Masters and transitioned into my second year as an EngD student with the Centre for Digital Entertainment.

Where does Zubr VR come into this pot of opportunity?

First of all, the EngD is a doctorate similar to a PhD, but with a twist. The structure of my EngD involves one year of a taught Masters programme plus three years of a research project. To make it an Engineering Doctorate, the Research Engineer (that’s me) must be paired with a company during the three-year research. My status will be that of a doctoral student, but based with the company. Zubr VR (the company I’m with) specialises in Virtual Reality, Augmented Reality and Mixed Reality. The difference between a PhD and an EngD is IMPACT: the finished research will benefit Zubr as well as help me submit a final thesis (wish me luck!).

The Project

The title of the research is Mixed Reality Production Solutions. The longer version is Exploring the use of real-time depth camera and compositing technologies in video for mixed-reality production solutions – technologies, workflows, implications for content creation. Zubr has already made a start on this in a previous blog post (https://zubr.co/introducing-zubr-vr-mixed-reality-video-studio/). So it is now my turn to continue the research and find ways to improve on their current solution.

Current Progress

I am now a few weeks into my time with Zubr and have already made connections with everyone here. This is actually my first time in industry, and it is awesome to see everyone collaborating with their own expertise. I am now settled, and it is about time to really begin this research project. Many more updates to follow!

Two leading Bristol virtual reality companies together launch new app that makes it easy for cultural venues to set up their own VR cinemas

New product will preview in beta at Encounters Film Festival, 25th – 30th September at Watershed.

The software system, officially releasing in early 2019, will allow arts venues and cultural attractions to set up their own virtual reality theatres in existing spaces. Groups of ticket-holders can experience curated selections of VR experiences simultaneously.

After two years of extensive research and testing, Limina saw the high demand from mainstream audiences for artistic VR, including at last year’s sold-out Limina VR Weekender with Watershed. Limina and Zubr plan for the app to get VR to at least 100,000 audience members in 2019. This new straightforward and scalable solution makes it easy for venues to capitalise on audience demand for all things immersive.

The teams’ joint analysis has demonstrated that this product has the potential to disrupt the out-of-home digital entertainment sector, boosting the South West’s global reputation in VR and bringing jobs to the area. The new product will be used throughout Encounters Festival next week in Watershed – the longest running short film and animation festival in the UK.

“We’ve seen first hand the audience demand for immersive experiences in cultural venues. It is this demand that has led to Zubr’s commercial success and our subsequent investment in Limina. We are going to eliminate the current faff that venues have to endure in putting larger audience groups through VR.”

Jack Norris, MD of Zubr

Zubr are a commercially orientated VR development company who invest significantly in their own R&D. As well as being Limina’s software development partner, they are also investing in Limina. Zubr have previously provided in-location VR solutions for venues including the Eden Project and We the Curious, where their experiences have amassed over 25,000 unique users.

“Reducing the barriers to entry into the world of VR is something I am deeply passionate about. Our VR exhibition app means that any venue can smoothly exhibit VR if it has the space and a paying audience. We’re excited to be previewing the Limina VR Exhibition app at Encounters, a really pioneering festival.”

Catherine Allen, founder of Limina Immersive

Encounters Festival’s VR strand runs from Weds 26th September to Sunday 30th in Watershed. VR screenings will run throughout the day and are open to the public. Tickets can be purchased from encounters-festival.org.uk

The fiery surface of Venus awaits visitors to the Eden Project’s Expedition Space event running now and throughout the summer as part of a cutting-edge virtual reality (VR) experience.

Eden has worked with Bristol-based VR studio Zubr to create the thrilling experience just for Expedition Space.

Zubr have previously worked on projects for clients including Aardman Animations and the team behind the Bloodhound SSC land speed record challenge.

The Eden experience is designed by a BAFTA-nominated VR designer and immerses visitors in the breath-taking vistas and sounds of Venus, taking them on a perilous journey from a base to a rocket ready to take off.

On the way they explore caves on the planet’s surface and traverse a perilous bridge over molten lava flows.

Players wear VR headsets and move through the game world by walking through a purpose-built area in real life. They hear a bespoke soundscape with bubbling lava and rumbling rockets which changes as the player moves through the world.

This is the first time Eden has hosted a full immersive VR experience. In the past Eden has hosted 360 degree films through VR headsets, but this new experience is fully interactive and has been designed especially for Eden. Some of the technology it employs has only become available in the last few months.

Zubr was formed three years ago by a team of people with a background in TV visual effects and video games development. Their team have worked on many VR and AR (augmented reality – a similar technology where computer-generated elements mix with the real world) projects for various clients.

“VR is one of the most dynamic new forms of entertainment and this is an experience that people can only have at Eden this summer. Where else will you able to set foot on the surface of Venus?”

Expedition Space is the ultimate space adventure holiday starring a Mission to Mars, mini-golf on Mercury and a daring slide down the surface of a moon.

The VR experience costs £5 per person in addition to the main Eden Project admission fee.

“This is a major development for Eden and we’re proud to be working with Zubr to bring this unique experience and exciting new technology to our visitors.”

Our friends over at Calvium invited us to create an experimental installation for their Ideascape public showcase at Porth Teigr, Cardiff – so we challenged ourselves to make a set of augmented reality binoculars that allow you to view the past and future of the surroundings.

Back at the beginning of summer, Calvium invited us to contribute to their exciting new project. Calvium were hosting a community engagement event in Cardiff Bay; a playful sandbox event of concepts and interactive prototypes related to the new Porth Teigr development. We would be helping to investigate how augmented reality could be used to encourage people to engage with their surroundings and stimulate their imaginations – contributing to Calvium’s research into digital placemaking.

My close relationship with Calvium goes back quite a few years – I worked on interface design and user experience for many of their heritage apps. We previously investigated ideas of overlaying historical maps and photos onto modern day views in Hidden Florence, a placemaking app designed for the Renaissance city. And when it comes to visitor engagement, we had great success exciting adults and children alike with the Tower Bridge Family Visitor Trail, which has interactive games that use a very physical element as well as accompanying printed material.

Working with Zubr over the past couple of years has taken me in quite a different direction, with realtime 3D graphics and head-mounted displays being a few steps away from placemaking apps. Nevertheless, I think there is huge potential for content to cross over here – which is why my colleagues at Zubr and I were excited when Calvium posed the question…

Our carpenter, Luke, went to town building the binoculars

“How can we visualise Porth Teigr’s history and future, in situ?”

We already knew we wanted to play to our strength of creating dynamic, visual experiences, and come up with something that simply allows people to visualise the past and future – literally. After some discussion and a couple of site visits to Cardiff Bay, our idea settled into the form of augmented reality binoculars, with a physical lever to transport the viewer through time. We took our inspiration from the metallic coin operated binoculars that are ubiquitous at many seasides and landmarks around the world – perfect for inviting anyone and everyone to experience cutting-edge technology without needing so much as a smartphone themselves.

We were confident that one of our well-established Cardboard VR deployments would fit nicely into the unit. Our fabrication engineer, Luke, was also very excited to build the physical viewer – seizing the chance to really indulge in making a beautiful physical piece that sparked people’s nostalgia – and referenced Porth Teigr’s industrial past!

Making use of Open Source terrain data and reference material

In order to make our software work perfectly for a very specific position, we used open source Digital Terrain Model (DTM) and Digital Surface Model (DSM) lidar datasets from the Welsh Government to recreate an accurate 3D model of Cardiff Bay. Combined with 16K 360 imagery we captured on site, this gave us a real-world foundation on which to build the augmented world view.
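As a rough illustration of that first step, a DSM is essentially a grid of elevations, and turning one into renderable geometry means triangulating the grid. This is a minimal pure-Python sketch with invented elevation values and cell size – not the actual pipeline, which would read the Welsh Government GeoTIFF tiles with proper GIS tooling:

```python
# Sketch: turning a DSM height grid into a simple triangle mesh.
# The grid values and cell size here are made up for illustration;
# real DSM tiles are GeoTIFFs read with a GIS library.

def heightmap_to_mesh(heights, cell_size):
    """Convert a 2D grid of elevations into vertex and triangle lists."""
    rows, cols = len(heights), len(heights[0])
    vertices = [
        (x * cell_size, heights[z][x], z * cell_size)
        for z in range(rows)
        for x in range(cols)
    ]
    triangles = []
    for z in range(rows - 1):
        for x in range(cols - 1):
            i = z * cols + x  # top-left corner of this grid cell
            triangles.append((i, i + cols, i + 1))             # first triangle
            triangles.append((i + 1, i + cols, i + cols + 1))  # second triangle
    return vertices, triangles

# A tiny 3x3 elevation grid (metres) with 1 m cells
dsm = [[2.0, 2.1, 2.3],
       [1.9, 2.0, 2.2],
       [1.8, 1.9, 2.0]]
verts, tris = heightmap_to_mesh(dsm, cell_size=1.0)
print(len(verts), len(tris))  # 9 vertices, 8 triangles
```

A real tile is thousands of cells per side, so in practice the mesh would also be decimated, but the two-triangles-per-cell structure is the same.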

Welsh Government Open Source Digital Surface Model of Cardiff Bay

The past: Old ships, trains and buildings augmented into the world view, exactly where they would have been

A map of Cardiff Bay from 1885, used as reference

From there on, we used some awesome reference material – including historical photographs, maps and concept art of future building developments – to help us figure out where (and when!) to position various augmented buildings, ships, trains and other objects. The end result of all this effort is that you’ll see an old steam ship in a dry dock, exactly where it would have actually been 100 years ago. You can even watch an old coal train rumble past, following a railway line traced from a 1930s map.

Adapting ARKit and ARCore for 3D Calibration

After conducting various tests with Apple’s ARKit, we decided to use it for this project. The reason for that might not be obvious, considering the built unit could just use a gyroscope to control rotation. However, we wanted to see if ARKit could be used to ‘calibrate’ our digital twin of Cardiff Bay – effectively allowing the augmented reality objects to realign themselves correctly, according to where the viewer is positioned.

In short, what that means is that the augmented view will automatically change if the binoculars are relocated. Amazingly, there is basically no limit to how far you can go. If we decided to move the binoculars 100 feet away, next to that old railway line which is now a road, this is what you’d see…
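The idea behind that kind of calibration can be sketched very simply: once tracking pins down where a known landmark from the digital twin appears in tracked space, a single offset realigns every augmented object at once. A minimal sketch, assuming translation only (real calibration also corrects rotation) and with invented coordinates rather than anything from the actual Cardiff Bay build:

```python
# Minimal sketch of the 'digital twin calibration' idea (translation only,
# ignoring rotation): once the tracking system recognises where the unit is,
# every augmented object is shifted by the same offset so it lines up with
# the real world again. Names and numbers are illustrative, not Zubr's code.

def calibration_offset(landmark_world, landmark_tracked):
    """Offset that maps tracked-space coordinates onto the digital twin."""
    return tuple(w - t for w, t in zip(landmark_world, landmark_tracked))

def realign(objects, offset):
    """Apply the calibration offset to every augmented object position."""
    return [tuple(p + o for p, o in zip(pos, offset)) for pos in objects]

# The steam ship sits at a fixed spot in the digital twin of Cardiff Bay...
ship_in_twin = (120.0, 0.0, -45.0)
# ...but after moving the binoculars, tracking reports it somewhere else.
ship_as_tracked = (118.5, 0.2, -47.0)

offset = calibration_offset(ship_in_twin, ship_as_tracked)
scene = realign([ship_as_tracked, (10.0, 0.0, 3.0)], offset)
print(scene[0])  # the ship snaps back to (120.0, 0.0, -45.0)
```

Because the same offset is applied to everything, the whole augmented scene stays internally consistent however far the unit is moved.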

And if you walk for five minutes across the bay and look back at the old dry dock next to the Gloworks building, the big ol’ 1900s steamer will still be securely docked. This clearly has huge potential for portable, user-device placemaking and trail apps.

Towards the end of the project, Google unveiled their answer to ARKit: The similarly named ARCore. With the chance to make use of a much greater screen resolution, we immediately jumped ship and converted the project to Android, settling on a Samsung S8 smartphone as the device of choice.

The night of the event

The augmented reality binoculars were featured as a central part of Calvium’s Ideascape evening. After a few teething problems with the physical unit, we had to swap the time-travelling lever with a flathead screwdriver. Nevertheless, that seemed to add to the charm, with many visitors being quick to make the connection between our handy time-travelling screwdriver and a certain Doctor Who Experience barely 100ft away!

Zubr Developer, Lukasz – on the right – has a cider in one hand, Sonic Screwdriver in the other

The AR binoculars went down extremely well with local residents and out-of-town visitors alike. That included city planners and architects, who were thinking about how they could use such a device for the visualisation of building developments and heritage applications. However, we were most excited by the reactions of local residents. Not necessarily tech-savvy, and not particularly accustomed to time travel, most people fully understood how to use the unit – presumably from cultural familiarity with the idea of coin-operated binoculars.

“People loved interacting with this installation, which mixes the nostalgia of seaside attractions with the magic of seeing a landscape change before your eyes. The fact that you don’t need anything to take part was key, allowing anyone, smartphone user or not, to walk up and see the past and future of the area spring to life around them.”

– Calvium

But most strikingly, most residents wasted no time spinning around to view Porth Teigr’s future building developments – inspecting the height of the buildings, seeing how far they will stretch down the bay, and speculating about coffee shops and pedestrian access. Various couples shared the binoculars and started their own conversations about the residential development – without a word about smartphone screens or ARCore. The technology we had created had become invisible, allowing anyone to peer through to the future in a way they already understand.

We can’t wait to see where these ideas will go next – a city-wide 3D heritage trail? Permanently installed Augmented Binoculars on the streets of Bristol? A super-engaging and accessible form of community engagement for city planners and developers? There are a few ideas in the pipeline, and who knows, you might get to peer through these binoculars sooner than you think!

Here at Zubr, we tend to allow ourselves a wide margin for error when it comes to defining the boundaries between augmented, mixed and virtual reality.

Perhaps more than any other company, we’ve had our AR/VR/MR boundaries mixed and mashed up right from the start – and I think that’s a good thing. Never have we concentrated on one format and then suddenly ‘discovered’ another – it has always been more like “AR/MR content that switches between mono and stereo view!” “AR-enabled positional tracking for VR content!” “A scene that starts in MR until you step through and it becomes VR!” “Mobile AR containing mini VR scenes!”

“We don’t think of Google Cardboard as a real VR device, because it doesn’t have six degrees of freedom/positional tracking.”

2014: Bringing Vuforia and Google Cardboard together

When we first experimented with the Vuforia AR engine in 2014, we weren’t really interested in making augmented reality. Despite previously working at Bristol’s leading AR technology supplier Kudan, and being well aware of the possibilities, our hearts were firmly set on exploring what we could do with the Google Cardboard. Excited by the presence of a hole for camera passthrough and even a ‘3D token’ disc supplied with every Cardboard, we started blending Vuforia’s AR and Google Cardboard’s VR capabilities together – something that, to our surprise, no one else was interested in at the time.

One of Zubr’s 2014 experiments in combining augmented and virtual reality

We arrived at a point where we would use AR image targets of different sizes and shapes to position 3D content in space, viewed in true stereoscopic 3D. Larger image targets could even be used to enable 6 degrees of freedom positional tracking for non-AR virtual reality scenes. An animated GIF of one of our early demos of this type posted on Twitter drew the attention of Google’s VP of Virtual Reality, Clay Bavor, and Vuforia’s Head of Marketing, Liz Philips, who was perhaps surprised that we hadn’t built it with Vuforia’s own VR compatibility toolset that had only just been released.

2015: Augmentation with depth-sensing devices

To build on the success of the original low-cost Cardboard VR device, Google unveiled the Cardboard V2 in 2015. Though many aspects of the design were improved, such as the lenses, build quality and portability, Google had removed the camera passthrough hole from the front. In doing so, Google had effectively chosen to separate AR and VR, at least for the time being – perhaps in recognition that no developers had created any meaningful AR/MR content with the first Cardboard headset.

2016: Microsoft Hololens woos the corporate crowds

A new type of headset, with visuals based on the centuries-old Pepper’s Ghost technique and depth-sensing abilities derived from the Kinect, the Microsoft Hololens very quickly became the ruler of a new segment of the industry – keenly pushed by Microsoft – called Mixed Reality.

It’s an impressive new direction for head-mounted hardware, and we’re certain that there’s a great deal of mileage in it, but sadly, the Hololens currently suffers from its very own Hype Train. Perhaps spurred by the business-class-feeling £3,000 price tag and relentlessly circulated stock images of ‘business users’ reaching out to touch ‘holograms’ in amazement, the Hololens has made incredible inroads into corporate environments, being demonstrated as an effective option for technical training and health & safety amongst other uses.

And, in a clichéd attempt to prove its worth to tech-savvy business users – despite having a field of view the size of a business card – most of those stock images you see will include a little OTT. No, not Over-the-Top – I’m talking about the Obligatory Turbine-Thing…

A Business Hololens User indulges in his OTT

Meanwhile, as we move into 2017, we can see increasing evidence that the previously-almighty VR hype train is slowing down fast. Sadly, some VR companies depended heavily on this hype translating to real business; and many are now struggling.

So what fresh innovations are popping up to keep everyone interested? Enter Apple.

2017: Apple unveils ARKit

There has been speculation going back over a year that Apple had been planning a major augmented reality toolset, with a number of relevant business acquisitions. In Spring 2017, the rumours turned out to be true – and Apple unveiled their awe-inspiringly-robust trackerless SLAM (simultaneous localisation and mapping) augmented reality system – ARKit.

We had already been testing our content with Kudan’s impressive SLAM system, which works across iOS and Android – so we were quick to try ARKit, and we were not disappointed with the results…

“Enables Developers To Create the Most Innovative AR Apps for the World’s Largest AR Platform.”

Naturally, we’re absolutely delighted that robust SLAM for mobile devices is finally here. Needless to say, it wasn’t long before we went full-circle and created a stereo split for ARKit demos, meaning you can experience the powerful AR engine through an immersive head-mounted display. In fact, we went full-on Microsoft, and created our very own Obligatory Turbine-Thing demo…

For many people, this is an exciting new development that has enticed them into the augmented reality world; simple and yet powerful enough for anyone to focus on creating fun and original content, instead of worrying about how to set up tracker images – which is great. Some VR developers have even hopped straight over to ARKit simply to explore its abilities for six degrees of freedom positional tracking, and they haven’t been disappointed!

So, if we look at the big picture at this point; it’s not hard to see that we are on the cusp of a real AR, MR and VR melting pot – with technologies overlapping, developers crossing boundaries and even content that swings from one format to the other with the touch of a button.

VR Hardware will be reinvigorated by the release of new standalone headsets – independent from PC units and yet more capable than smartphone headsets, they hold the potential for a renewed push to bring VR to the mainstream.

Also, dedicated MR headsets will soon be affordable and versatile – for business users and entertainment purposes alike.

Finally, AR is likely to retain pole position out of the three technologies in regards to general uptake, at least for a while.

For us, it’s very exciting that hybrid experiences and an ‘XR’ approach are finally gaining traction. Although, that is business as usual in our studio – looking at our portfolio is sort of like taking an ‘Is it VR, MR or AR?’ quiz. For the big players – we’re all expecting Apple to cosy up more with VR technologies, especially now ARKit is out there. For Google, expect much-increased interplay between their VR and AR ventures, as they attempt to catch up with Apple on the AR side.

Oh, and Google – We knew you were making a mistake when you filled in that camera passthrough hole on the Cardboard V2 and Daydream headset. If you could please bring it back for your next headset, we won’t have to use craft blades so often!

You might think that we spend most of our time developing AR and VR applications for smartphones – but we actually do plenty of experimental R&D with different cameras, scanners, VR headsets, inputs and physical setups. It’s all part of the philosophy of pioneering new ways of making content, and continuing to create accessible experiences with off-the-shelf hardware, which we then feed into our other projects.

One of these research projects is our ongoing exploration into a Mixed Reality Video Studio.

Meet Mixed Reality Video

Making a mixed reality video is generally considered to be the practice of creating a video which makes it look like the VR user is actually placed in the virtual world they are seeing.

As more and more studios create awesome VR content to show off, the popularity of producing mixed reality videos has risen sharply. For most purposes, a simple green screen setup and basic camera synchronisation will produce a pretty sweet video. But what about making something a bit more advanced, where you want to embed the video footage right into the middle of the scene, with virtual objects not only behind but also in front of the person?

Like with any video production, you can always push your video through a conventional post production pipeline, spend some solid hours compositing footage with foreground elements in Adobe After Effects, and end up with something which is broadcast-worthy. That’s fine, but it can easily become restrictively expensive, and, well, we’re not a video post-production house.

Depth test with Jack and Laika

Realtime leads the way! Again!

Which brings us onto the next possibility: Compositing video footage directly into the VR game engine, in real-time.

We’ve played with depth-sensing cameras such as the Microsoft Kinect since the beginning. We have Kinects, Zedcams, Realsenses and Tangos permanently scattered all over our studio, being put to use for game inputs, volumetric video capture, 3D scanning etc. So why not use depth-sensing abilities for compositing mixed reality videos?

Well, that’s what we thought when we started using the Kinect V2 for some early efforts at realtime MR compositing. However, the ‘bubbly’ noise produced by the depth image, coupled with the infrared interference with the HTC Vive, made the Kinect V2 a difficult choice for this. That’s why we started looking into Stereolab’s ZED cam – which calculates depth values from the separation between two RGB cameras.

Clearly, many people around the world are exploring similar ideas. Most notably, at the same time that we were getting stuck into it, the geniuses at Owlchemy Labs – a VR games company in Texas – blew away any expectations of what can be achieved with realtime depth compositing with the video clips and explanations on their Mixed Reality Tech blog. With their VR game Job Simulator being a massive hit across all the high-end VR devices, these guys have clearly made a priority out of finding an intuitive way to show audiences what it looks like to see a person immersed in their virtual scenes.

Anyway – the ZED cam does a very nice job of producing a realtime depth map. It isn’t perfect – part of the depth calculation is an algorithm which basically smooths/estimates some depth values; notice the soft, blurred areas in the depthmap above. But for the most part, it’s very good. That is not to say it is in any way easy to make this thing work how we wanted it to – blimey!
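At its core, realtime depth compositing comes down to a per-pixel depth test between the camera feed and the rendered scene: whichever surface is nearer wins. A toy sketch over a single row of pixels, with entirely made-up colour and depth values – a real implementation does this per frame on the GPU:

```python
# Sketch of realtime depth compositing, one pixel row at a time: wherever the
# camera's depth map says the person is closer than the rendered geometry,
# the video pixel wins; otherwise the virtual pixel shows through.

def composite_row(video_rgb, video_depth, virtual_rgb, virtual_depth):
    """Depth-test each pixel: the nearer surface is drawn."""
    out = []
    for vc, vd, gc, gd in zip(video_rgb, video_depth, virtual_rgb, virtual_depth):
        out.append(vc if vd < gd else gc)
    return out

# Person (video) at ~1.2 m, with a virtual crate at 1.0 m partially in front
video_rgb     = ["skin", "skin", "skin", "green"]   # 'green' = keyed-out backdrop
video_depth   = [1.2,    1.2,    1.2,    99.0]      # keyed pixels pushed to far plane
virtual_rgb   = ["crate", "crate", "sky", "sky"]
virtual_depth = [1.0,     1.0,    50.0,  50.0]

print(composite_row(video_rgb, video_depth, virtual_rgb, virtual_depth))
# → ['crate', 'crate', 'skin', 'sky']
```

Note how the crate correctly occludes the person where it is nearer – the effect that simple green-screen compositing, with the video always pasted on top, cannot achieve.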

So, we mounted this camera on a special rig along with an HTC Vive controller (for positional tracking, which keeps it synchronised with the position of the virtual camera), an Android smartphone (on which we render a realtime virtual viewfinder app for the cameraperson to see what they’re filming), and a game controller (for the cameraperson to control virtual zoom, exposure and lighting controls).
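Keeping the virtual camera synchronised with the physical rig boils down to composing the controller's tracked pose with a fixed, once-measured mounting offset from controller to camera lens. A simplified 2D, yaw-only sketch with invented numbers (real tracking uses full 3D rotations, typically quaternions):

```python
import math

# Sketch of how a virtual camera can follow a tracked rig: the controller
# reports its pose, and a fixed mounting offset (measured once, from
# controller to camera lens) is rotated into world space and added.
# The offset and poses here are invented for illustration.

def camera_pose(controller_pos, controller_yaw, lens_offset):
    """World-space lens position from the tracked controller pose (2D, yaw only)."""
    c, s = math.cos(controller_yaw), math.sin(controller_yaw)
    ox, oz = lens_offset
    # Rotate the local mounting offset by the controller's yaw, then translate
    return (controller_pos[0] + c * ox - s * oz,
            controller_pos[1] + s * ox + c * oz)

# Lens sits 10 cm forward of the controller along its local x axis
lens = camera_pose(controller_pos=(2.0, 3.0),
                   controller_yaw=math.pi / 2,   # rig turned 90 degrees
                   lens_offset=(0.1, 0.0))
print(round(lens[0], 3), round(lens[1], 3))  # 2.0 3.1
```

Applying this every frame means the virtual camera's view always matches what the physical camera is pointing at, which is what makes the composited footage line up.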

Our progress so far

The camera operator can adjust virtual zoom, exposure and lighting controls from the physical camera rig

The subject can both cast and receive lights and shadows from its virtual surroundings

The subject is correctly depth-sorted in the scene, even behind transparent and translucent objects

Current areas being worked on:

Our greenscreen/keying setup is a quick fix – this is the reason for the dodgy edges in the video, NOT the depth feed! So, we need to upgrade our keying facility, basically.

Lighting and shadows can be smoother

Calibration needs to be easier
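To illustrate why a quick-fix keying setup produces dodgy edges: a naive chroma keyer uses a single global colour-distance threshold, which misclassifies ambiguous green-spill pixels along the subject's silhouette. This toy sketch (all values invented; production keyers use soft mattes and spill suppression instead) shows the failure mode:

```python
# A toy chroma key: pixels close enough to the key colour become transparent.
# A crude global threshold like this is exactly what produces rough edges;
# production keyers use soft mattes and spill suppression. Values invented.

def chroma_key(pixels, key=(0, 255, 0), threshold=120):
    """Return an alpha (0 or 255) per RGB pixel based on distance to the key colour."""
    alphas = []
    for r, g, b in pixels:
        dist = ((r - key[0]) ** 2 + (g - key[1]) ** 2 + (b - key[2]) ** 2) ** 0.5
        alphas.append(0 if dist < threshold else 255)  # 0 = keyed out
    return alphas

frame = [(30, 240, 40),    # greenscreen backdrop -> correctly keyed out
         (200, 180, 170),  # skin tone -> correctly kept
         (60, 200, 70)]    # green-spill edge pixel -> wrongly keyed out: fringing
print(chroma_key(frame))
```

The third pixel, a subject edge contaminated by green spill, falls inside the threshold and gets cut away, which is exactly the kind of artefact an upgraded keying facility would fix.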

Where to next

Increasing general reliability and flexibility of the system is a big step to conquer next before we can see how it works in some real, non-demo content.

Adapting it further for TV/Broadcast users is an important one for us. We want real, human-world camera operators to feel at home with our absurd camera rig.

Integrating a 3D-scanned face overlay to the mixed reality result, inspired by Google’s experiments, is a great way of making sure we can still see the virtual user – we’re working on it.

Expanding usage to non-VR – At the moment we’re focusing on how this solution applies to virtual reality content. However, in the near future, we will be expanding its use cases to meet conventional practices in broadcasting and the wider media industries (think along the lines of the hilariously expensive BBC News Virtual Studio, but so versatile you can even use it in the field).

Interested in the Zubr Mixed Reality Video Studio?

We are interested in making partnerships in broadcast and media industries to help bring this solution to fruition. Please contact us if you’d like to learn more about the system, arrange a demonstration and perhaps work with us!

Zubr more than doubled in size and business activity in 2016, and we’re on track to do the same again in 2017. So, what have we been up to?

We worked on a total of 26 immersive media projects in 2016. That includes 15 Android-based VR/AR/MR experiences, 3 HTC Vive builds, 3 permanent physical installations, and 1 aerospace project so secret we can’t tell you anything about it.

We created the World’s First fully automated 4D scanning installation, in which the scanners have been fired up more than 100,000 times since it was switched on. We produced and released an unparalleled Open-Source development kit for the Fulldome planetarium format, opening up a difficult format for everyone. We invested heavily in Research & Development, developing our own realtime depth-compositing solution for the television industry, pioneering realtime 4D scanning deployment for the performing arts, and devising a unique content management system for web VR experiences.

We have forged some brilliant partnerships with innovative organisations; becoming a close delivery partner with the At-Bristol Science Centre, an augmented experiences developer with Calvium, a WebVR collaborator with Yadda and a multi-faceted VR production team with Trainrobber and Wolf & Wood. Together with these awesome people, and many more, we have some huge, groundbreaking projects lined up for 2017.

So, could this be where you come in?

If you’re a Unity developer, technology tinkerer, or Jack-of-all-trades (which is a good thing by the way), and are interested in joining our team, we’d love to hear from you.

Don’t worry about having any experience of working on VR projects, we’ll soon fix that!

It’s a simple but powerful immersive environment where you can look around at some exclusive content from this month’s Bjork features in Crack Magazine.

To create the virtual environment, we created custom artwork to fit Crack Magazine’s cover feature of Bjork – including a seamless 360 cubemap and meticulously dimensionalised key images. The experience was built using Zubr VR and Yadda’s awesome new Enso Experience platform, a versatile, CMS-controlled system for creating great VR content that works right in the browser. Crack Magazine also made custom Cardboard viewers to go alongside the experience!

Head over to our Bjork’s Future project page to check out some screenshots, or click here to go straight to the experience itself (that’s right – no app to download).

It’s been a busy year here at Zubr VR, and we’ve gone through a lot of changes – the most recent of which is the overhaul of our website (welcome, by the way).

Recently we have been engaging closely with some fantastic people and companies around the world. We have built relationships with Trainrobber, Yadda, Calvium, Mimesys, Wolf & Wood and At-Bristol to name a few, and together with them we’re working on some really interesting projects. For example, our Test Lab project with At-Bristol is the first of its kind anywhere in the world; an automated system capturing incredible 4D scans of its visitors and rendering them as augmented holograms on the table in front of you. There’ll soon be some really exciting upgrades applied to this, increasing the quality of the scanning, and allowing visitors to see their holograms on the moon, amongst other places!

With our fingers in so many pies across different industries (sorry we weren’t more specific, business advisors – we just really love making cool stuff for everyone), we’ve had to tackle a lot of research and development work to get to where we are now. But, this is great, as it’s helped us identify our six main areas of expertise. For each of these we have developed our own unique approach and set of solutions. This includes a number of innovations; such as our automated LAN timeline system for Test Lab, a CMS-controlled system for browser-based VR content, our compositor-based live action scene dimensionalisation (bear with me), and our method of hybrid 3D scanning techniques. I won’t get into too much detail, or we’ll be here all night, but check out some of the projects we have here on the website to get a closer look.