The LIDAR dataset produced by AOC Archaeology was a point-cloud model of the church, home of the National Centre for Early Music, comprising over 40 million points. Using MeshLab, this was reduced to around 500,000 points so it could run efficiently in Trapcode Form without memory errors. The reduction also created the 'x-ray' aesthetic of the animation: the point-cloud resolution was low enough to see through the walls and columns. To produce the equirectangular video output for 360, the Mettle Skybox Studio plugin was used in After Effects to render the Trapcode Form object. This plugin created a six-camera setup to produce a cubemap, which could then be unfolded into the equirectangular image.
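The unfolding step can be sketched in a few lines of maths. This is an illustrative assumption about how a cubemap maps to an equirectangular frame in general, not the internals of the Skybox plugin: each output pixel corresponds to a longitude/latitude pair, which gives a view direction, and the direction's largest axis picks which of the six camera views to sample.

```python
import math

# Hedged sketch: mapping an equirectangular pixel back to one of the six
# 90-degree cube faces. Face names and orientation conventions here are
# assumptions for illustration, not the plugin's actual implementation.

def direction_from_equirect(u, v, width, height):
    """Convert an equirectangular pixel (u, v) to a unit view direction."""
    lon = (u / width - 0.5) * 2.0 * math.pi   # -pi .. pi across the frame
    lat = (0.5 - v / height) * math.pi        # +pi/2 (top) .. -pi/2 (bottom)
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return x, y, z

def face_for_direction(x, y, z):
    """Pick the cube face a direction hits: the axis with the largest magnitude."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+x" if x > 0 else "-x"
    if ay >= ax and ay >= az:
        return "+y" if y > 0 else "-y"
    return "+z" if z > 0 else "-z"
```

With these conventions, the centre of the frame looks straight ahead and the top row looks straight up; a full renderer would then sample the chosen face's image at the intersection point.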

Utilising the same Trapcode Form asset, a sound-reactive version was also developed as a visual response to Jez Wells's 16-speaker audio installation, which was also part of the Vespertine event that evening.
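Trapcode Form has its own built-in audio reactor, so the following is only a sketch of the underlying idea: reduce the audio to one envelope value per video frame, which can then drive a visual parameter such as point size or displacement.

```python
import math

# Illustrative sketch (not Trapcode Form's actual audio reactor): compute a
# per-frame RMS amplitude envelope from raw audio samples.
def rms_envelope(samples, sample_rate, fps):
    """Return one RMS amplitude value per video frame."""
    hop = int(sample_rate / fps)  # audio samples covered by one video frame
    envelope = []
    for start in range(0, len(samples), hop):
        window = samples[start:start + hop]
        if not window:
            break
        envelope.append(math.sqrt(sum(s * s for s in window) / len(window)))
    return envelope
```

Each envelope value can be scaled and keyframed (or fed via an expression) onto whichever particle property should pulse with the sound.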

On the evening, Google Cardboard headsets were used with a BYOD (bring your own device) expectation of the audience. This worked out roughly 50/50: about half the audience had a smartphone or tablet capable of playing 360 YouTube video. Part of the event was also the audience constructing their own headsets from pre-cut kits, which they could take away afterwards. Ink stamps and other art supplies were on hand to customise the Cardboard headsets, which is something I feel needs to be exploited further in future projects, as there is real value here in the audience's investment and addition to the creative conversation.

The experience of viewing the LIDAR data whilst standing in the same space was an interesting experiment. Usually the expectation of these videos is escapist: to take you out of the current space to somewhere new. The emergent play of people using the headset, then looking around the church interior and back again at the animation, was slightly unexpected but welcome. The original conception was that the video would be watched in the reception area before people filtered into the church's main hall. Anecdotally, audience members gave feedback that it made them see details of the space they hadn't focused on before, particularly the rafters in the roof. This cheap 'mixed reality' is something I want to focus on in future projects: like AR, using the technology to add to the immersion and narrative of the space the audience is already in.


Lecturer at the University of Salford teaching BA (Hons) Animation.
Her practice is moving-image based, working with performance and stories across screen and interactive media. She is also interested in the applied use of creative technologies.
Annabeth is also a member of the Leeds Creative Timebank, where she supports documentation of community and arts projects in the region.
Her work has been shown internationally, notably as an artist based in the virtual world of 'Second Life', where she is known as Angrybeth.
She also uses other game engines and 3D virtual worlds for educational purposes, and consults on and helps design projects which make use of game technology. Currently she is collaborating with Leeds University's Earth Science Department to create serious games for geology.
Her real-world website is at Annamorphic.net