Virtual Reality

Working with urban regeneration developers Stow Projects and architects Doone Silver Kerr, we’ve developed a VR simulation of a new sustainable, eco-friendly apart-hotel made from re-purposed shipping containers. This simulation presents a unique opportunity to showcase room-scale VR and our new untethered HTC Vive headset using a TPCAST wireless adapter, freeing the viewer from the burden (and trip hazard) of the headset’s ‘umbilical cord’.

Thanks to the compact size of the apart-hotel bedroom we were able to track the viewer’s location in the virtual simulation using just two lighthouse sensors, and we were able to install the experience in an empty shipping container of exactly the same size at The Artworks Elephant. Consequently, we’ve curated a VR experience that takes place in a physical space with the same dimensions as the luxe apart-hotel room, greatly enhancing the ‘presence’ of the walk-through.

Through the combination of the wireless HMD and the one-to-one mapping to the physical space, one can grasp the full power of a VR simulation, walking unimpeded as if in the very hotel room itself, reaching out to touch walls and ceilings in VR and feeling them in reality.

Next week we open the experience to the press and the VR community to try for themselves. If you are interested in coming along, please book an appointment; details below.

About Stow Away Hotel
Located in the heart of Waterloo, within the vibrant cultural hub of Lower Marsh, Stow Projects, Ciel Capital and Bridge Street Global Hospitality are opening the affordable luxury Stow Away apart-hotel at the end of the year. Designed by Doone Silver Kerr Architects, it is a fully sustainable and eco-friendly project made from 26 re-purposed shipping containers for comfortable short- and long-stay living in central London.
www.stow-away.co.uk

About Stow Projects
Stow Projects are a creative property development and urban regeneration company. They form partnerships with landowners to fund, develop and operate workplaces, apart-hotels and residential accommodation. Regeneration and sustainability are the key drivers behind their projects, and they find ways to unlock difficult sites through innovation, for example with the use of flexible modular units. They aim for all their projects to mature into new places that will become socially and economically integrated into boroughs undergoing great social change.
www.stowprojects.com

VR History

The idea of VR had been around long before the first computers arrived. Photography drove the idea initially, with our desire to step inside the camera ‘space’. In 1838 Sir Charles Wheatstone described the stereoscope, and the stereo viewer was born. The key feature of the viewer was to provide each eye with a slightly different view of the scene, replicating depth for a powerful three-dimensional effect. At a time when photography was still a form of magic, stereographs were extremely popular and remained so for over 100 years. Fast forward to the 1990s, and it was no surprise that immersing ourselves inside a computer became a new obsession; the desire to inhabit the device, much as the Edwardians did with cameras, could explain why the idea persisted. VR quickly gained a cultural significance marked out by cult books like Snow Crash and films like The Matrix, during a period in which computer power increased more than a million-fold between 1995 and 2012. It was in 2012 that the Oculus Rift launched its now-famous Kickstarter campaign, and the resulting Development Kit 1 provided the first viable consumer hardware, so VR development could start in earnest. The basic premise of the device remained similar to the early stereo viewers, presenting the world in full 3D, albeit with a much wider field of view for better immersion.

19th century stereo viewer

HTC Vive 2017

Adoption

As with any new technology, the VR adoption trajectory has followed a familiar path. Initial excitement and wonder gives way to a more realistic outlook on the limitations and challenges of the medium. VR is slightly complicated in this respect due to the symbiotic relationship between the hardware and the software ‘content’ designed to provide the experience. Whilst we can rely on the hardware to improve over time, in much the same way we do with phones or televisions, there is no similar assurance with content, because we are still learning what does or does not work well; VR is a computer without an operating system, a completely blank slate. Despite this, the use cases in architecture are seductive and present opportunities to engage our clients and ourselves in ways that improve on how we currently work on a screen. We can finally enhance the act of looking or seeing and give the user an ‘experience’ that, executed properly, provides far more bandwidth than traditional media can.

Hardware

VR hardware is extremely difficult to make work and, despite decades of research and development, until recently the technology simply did not exist to produce a fully functioning VR headset. The best systems cost millions of pounds but were still uncomfortable and not particularly immersive. Only now, with several technologies maturing at the same time, has a working VR headset become possible. John Carmack, co-creator of Doom and a pioneer of real-time 3D graphics engines, demonstrated the initial Oculus prototype, and since then some of the sharpest minds in programming and engineering have contributed to the hardware we see today. The Oculus Rift and HTC Vive, amongst others, represent the first generation of VR devices and, advanced as they are, they are in many ways analogous to very early black and white television sets; they work, but there is room for improvement in key areas like resolution and field of view. Even so, we achieve a feeling of immersion because the latency is low enough and the field of view wide enough (under 20ms and 90+ degrees respectively) to give the user ‘presence’ (a feeling of being there).

John Carmack demonstrates the first “Oculus” prototype – E3 2012.

Real-Time Engines

A VR scene must deliver 90 images a second for a comfortable experience, requiring a much faster method of rendering. Interestingly, during the 1990s and 2000s (the same period in which we dreamed about VR hardware) came the development of rendering techniques for computer games such as Doom, which permitted a 3D scene to be navigated in real time. These early software engines were also a catalyst for graphics hardware, driving the graphics processing unit (GPU) arms race that today is dominated by Nvidia and AMD. Since this inflection point the development paths of graphics hardware and real-time engines have been closely aligned, and the boundary between hardware and software functionality overlaps frequently.
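The 90-images-a-second target translates directly into a per-frame time budget, which is the constraint real-time engines are built around. A minimal sketch of that arithmetic (the 90fps figure is from the text above; the cinema comparison is illustrative):

```python
# Per-frame time budget at different refresh rates.
# The 90 fps VR target comes from the text; the 24 fps figure is
# the traditional cinema rate, included here for comparison only.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to produce one frame at a given rate."""
    return 1000.0 / fps

vr_budget = frame_budget_ms(90)    # roughly 11.1 ms per frame
film_budget = frame_budget_ms(24)  # roughly 41.7 ms per frame

print(f"VR frame budget:   {vr_budget:.1f} ms")
print(f"Film frame budget: {film_budget:.1f} ms")
```

An offline renderer taking minutes per image is several orders of magnitude outside this budget, which is why game-engine techniques, rather than traditional ray-traced rendering, became the foundation of VR.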

Driven as much by a pure spirit of discovery as by the games industry itself, new announcements and publications continue to advance real-time technology at annual fixtures such as SIGGRAPH and GDC, whilst the hardware continues to exceed Moore’s law. At the time of writing it is possible to get near-cinematic quality at 90fps using a decent consumer graphics card.

With all the excitement and hype surrounding VR hardware, it is easy to see why the content creation technology has not received as much interest. 3D graphics engines are, however, the primary enablers of VR, and currently Unity and Unreal Engine dominate this space. The ability to render entire scenes at near photo-real quality so they can be inhabited and experienced is tantalising.

VR Development

A curious aspect of VR is how it regularly betrays our expectations whilst providing experiences we did not plan for; it is counter-intuitive. As artists we are used to controlling the medium, framing the image, using composition and layout to tell a story. There is great value in being able to contrive content like this, hovering between true representation and artistry. The same can be said of animation, where we build narrative with the same techniques through time. VR is different; there is no framing or composition or timing in the conventional sense. At its most basic, VR is about placing the user somewhere else – anywhere, at any time. Beyond that there are no limitations to what can be done, which makes it, in essence, capable of anything you can imagine. With so much potential, it is important to provide some framework for the medium through the following areas:

VR sickness and locomotion

At its heart VR is an illusion: a trick of the mind pulled off by having almost no latency anywhere in the system. From the moment we move our head, the headset samples the new position, predicts slightly into the future, and calls on the rendering engine to draw a new image to the screen – the whole process, from motion to updated image, must take less than 20ms to be effective. For the most part VR sickness has been eliminated through good hardware design, so we must ensure it doesn’t return in our content. Most VR sickness occurs when there is a difference between what our eyes see and what our balance system (the vestibular system) feels. For this reason one of the first areas of research has been how to move around our virtual worlds. Using a normal joystick or game controller doesn’t work because, whilst you are moving in VR, the balance system doesn’t feel it and nausea sets in. Teleportation has therefore emerged as the de facto standard for locomotion and, together with physical movement, allows us to explore huge worlds with ease.
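Under the hood, the teleport mechanic is little more than a ray–plane intersection: the controller’s pointing ray is tested against the floor and the player is moved to the landing point. A minimal sketch of that geometry (the flat-floor assumption and all names are illustrative; production systems typically cast an arced ray and validate the destination against walls and obstacles):

```python
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]  # simple (x, y, z) triple, y is up

def teleport_target(origin: Vec3, direction: Vec3,
                    floor_y: float = 0.0) -> Optional[Vec3]:
    """Intersect the controller's pointing ray with a flat floor plane.

    Returns the landing point, or None when the ray points level or
    upward (no valid teleport destination to highlight).
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy >= 0.0:
        return None  # aiming at or above the horizon: nowhere to land
    t = (floor_y - oy) / dy  # ray parameter where height reaches the floor
    return (ox + t * dx, floor_y, oz + t * dz)

# A controller held at 1.6m, angled forward and down at 45 degrees,
# lands the player 1.6m ahead on the floor:
print(teleport_target((0.0, 1.6, 0.0), (0.0, -1.0, 1.0)))
```

Because the player’s head never moves through the intermediate space, the vestibular conflict described above never arises: the eyes see a cut, not motion.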

Scale

Most people think about VR as an analogue of the real world – being in a virtual house or room or street with everything rendered at life size. Scale is an essential part of the design process, and traditional physical models are painstakingly built to understand different aspects of our designs. Being able to view a building as a 1:100 model, then seamlessly switch scales so that one simulation communicates everything from details to a masterplan, is extremely powerful. Scale matters particularly in VR because the stereoscopic design of the headsets underpins this primary illusion, and so it can be used to great effect.
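Switching scales in this way amounts to scaling the model about a fixed pivot, typically the viewer’s position, so the world grows or shrinks around you rather than sliding away. A minimal sketch of the transform (function name and the point-list representation are illustrative):

```python
from typing import List, Tuple

Point = Tuple[float, float, float]  # (x, y, z) position in metres

def rescale_about(points: List[Point], pivot: Point,
                  factor: float) -> List[Point]:
    """Scale geometry about a pivot so the point under the viewer stays
    put - e.g. factor=0.01 turns a life-size model into a 1:100 model."""
    px, py, pz = pivot
    return [((x - px) * factor + px,
             (y - py) * factor + py,
             (z - pz) * factor + pz) for (x, y, z) in points]

# Viewed from the origin, a corner 100m away collapses to 1m away
# when the model drops to 1:100:
print(rescale_about([(100.0, 0.0, 0.0)], (0.0, 0.0, 0.0), 0.01))
```

In a real-time engine the same effect is usually achieved by scaling the root transform of the scene (or the player rig) rather than every vertex, but the pivot logic is identical.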

Design development

Working in real time opens up an extremely agile and flexible design development toolkit, where 3D content can be made and experienced simultaneously, both on-screen and in VR. In this way VR and real-time rendering combine into a process-driven medium that permits unhindered iteration when refining a design.

Testing furniture layouts in a given space, for example, would mean standing in that space and switching between layouts whilst moving around, to make a definitive assessment. VR in this way provides a platform in which people can participate in and refine a project rapidly. The explicit nature of VR makes decision-making much less ambiguous than when reviewing an image, and because everything can be changed instantly there is no bottleneck for updates.

Presentation

Much has been said about the headsets being an issue for presentation: they are unsightly, cumbersome or nerdy, and people feel isolated when wearing them. The fact remains that VR is a very private experience and, even with some of the social VR platforms and experiences coming to market, the sensation of being alone in VR persists. Rather than fight this we can play to its strengths: VR is the perfect medium to hold someone’s attention, and we can capitalise on the user’s focus and use immersion to engage people emotionally. In many ways this is the primary goal of VR: to captivate our audience, whoever they may be, so that when the headset comes off the message has successfully gone in!

User experience

The final layer of the VR development stack is the ‘user experience’, and the extent to which it enhances the content being shown. At its most basic, VR is frequently a matter of loading a 3D model and providing a teleportation system for navigating around it. The user experience here is simply a navigation mechanic, and in many cases this is optimal. Alternatively, we can design a passive experience where movement through the model is scripted so the user doesn’t need to do anything. In both cases we are providing basic navigation, but the user experience is profoundly different. In this way small changes to how we engage the user, perhaps through a user interface or other methods, can have a huge effect on how the content is experienced.

Conclusion

The future looks exciting for VR and, as the next generation of headsets starts to take shape, developers are beginning to build incredible content. Beyond the familiar roster of games, demos and experiences there is an enterprise VR community busy building tools for the future: therapy VR, training simulations, remote interaction systems and VR visualisation, to name a few. This expanding development ecosystem lies at the heart of the technology and will be the key to its success as we march inexorably towards the Matrix. Exciting times ahead.

Doing is better than looking – push that wall out wider.

Virtual reality might finally be becoming accepted. This year saw the launch of various VR headsets, notably the Oculus Rift and HTC Vive, allowing fully immersive movement in a virtual environment running from a PC. Navigation in the Oculus is by standard game controllers, such as an Xbox controller, or in the Vive by physically moving between a pair of sensors mounted up to 5m apart. The latter obviously gives a greater sense of engagement and involvement; your movements in the physical world are replicated in the virtual.

There are also less expensive, unwired versions into which you must slot your phone, such as the Samsung Gear VR or Google Cardboard, which essentially play back a 360° image or video. The main difference between these and the wired viewers is that they are single nodal point viewers, so while you can interactively turn your head you can only move through the environment by transporting to another node.

Within the architectural community uptake of real-time rendering has been slow – partly due to poor materiality, textures and lighting. But with the qualitative advancement of gaming engines, notably Unreal Engine 4 (UE4), the potential is fantastic. Real-time output runs at 60+ frames per second, reducing overnight render times to a fraction of a second, and full-resolution marketing images and animations can be output as by-products of the real-time scene.

“The ability to use movement and gestures to shape the virtual environment around you is tantalising. Modelling while ‘inside the model’, pushing volumes and walls and openings around to sense instantaneously how that alters the space, feels like a gamechanger.”

We’ve been using UE4 for a while now, most notably for London Business School’s new premises. We recently restored (virtually) one of its rooms to create a fundraising image, building a 3D model, rendering it and producing a ‘traditional’ interior CGI. Once complete, although it looked great, the wide-angle view just didn’t feel like the physical space we’d visited. But when the same 3D model was put into the headset it felt spatially identical to being in the room. How our minds interpret 2D wide-angle photography versus perceived reality is a massive subject, but the effect speaks for itself: VR feels so much more real. Not only is the quality a vast improvement, but the experiential possibilities are much more believable. The word ‘immersion’ is used a lot – rightly so.

VR’s strength has yet to be fully appreciated by architects, as have the Vive’s ‘paddles’, which allow the user to interact with the space. The use of a mouse/tablet is so ingrained that it’s difficult to imagine another interface, but the ability to use movement and gestures to shape the virtual environment around you is tantalising. Modelling while ‘inside the model’, pushing volumes, walls and openings around to sense instantaneously how that alters the space, feels like a gamechanger; VR in VR.

The wider potential is mind-blowing. Imagine combining your own immersion with others, where all put on headsets and meet in a virtual room – a form of 360° immersive Skype. Soon we will be able to walk around the pre-application scheme with all the design team in a shared virtual environment, discussing and altering the design to see the implications in real time. We are as close as we have ever been to achieving the state of lucid dreaming…