Elijah and I (of ShowStages) were invited to add video to the Expanse Cabaret event at New City Legion for this year’s Expanse festival. The variety show was part of the Expanse Movement and Dance festival, presented by Azimuth Theatre.

The cabaret event featured a high-energy performance by Mark Mills, with a variety of dance and dramatic performances throughout the evening.

Setup

For outputs, we had access to New City’s existing gear, plus our own arsenal of projectors and equipment. This worked out to be:

Two HDTVs installed on the bar and above the pool table

A projector and screen mounted on one wall

Two of my projectors (including a new Dell 5100MP I found on Kijiji)

Eli’s Epson projector

With that many outputs, we needed to stretch the capabilities of our two MacBook Pros, with just one output each. On my system, I ran our TripleHead2Go to power two projectors pointed cross-shot on the stage. The third output powered a projector near the bar that beamed a steady stream of tweets across the bar top.
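We never published the guts of the Twitter bar, but the core of any ticker like it is just a sliding window over a looped line of text. A minimal Python sketch of the idea (the sample messages and window width below are made up, and our actual rig ran in Quartz Composer, not Python):

```python
def ticker_frames(messages, width=40, separator="  ::  "):
    """Yield successive fixed-width windows of a looped, scrolling text line."""
    line = separator.join(messages) + separator
    doubled = line + line  # lets the window wrap around seamlessly
    for offset in range(len(line)):
        yield doubled[offset:offset + width]

# One full scroll cycle of two sample messages at a 20-character window:
frames = list(ticker_frames(["@expanse great show!", "loving the visuals"], width=20))
```

Render each frame in sequence and you get the steady crawl across the bar top.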

On Eli’s machine, we ran the projection screen straight out of the MBP, and used my USB DisplayLink adapter to get an extra output. We mirrored this channel to power both TVs with the same content. The USB adapter performed like a champ, with no glitches or noticeable lag with the content we were giving it.

We ran VGA for all sources, which took the bulk of our setup time. Hopefully we can work towards cutting this time down with VGA extenders or a road case rig.

Software

We really wanted to put our MadMapper rig to good use on this show. The angle we projected the Twitter bar from was really awkward: a steep, upside-down diagonal. It wasn’t ideal, but it was a convenient placement and got the messages across. MadMapper got it set up in no time at all.

The other two projectors beamed onto each side of the stage. I mapped out a few amplifiers, the walls, and some picture frames we whited out with a few old posters. The response was great, with one person even asking if we had mounted screens in the walls. The low light combined with the decent res of the Dell made for a great map.
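For anyone curious what tools like MadMapper (and Isadora’s 3D Quad Distort) are doing under the hood: fitting a rectangular video frame onto an arbitrary quad is a perspective (homography) transform, solvable from the four corner correspondences. A rough pure-Python sketch of the math only, not anything these apps actually expose:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def quad_homography(src, dst):
    """3x3 perspective transform taking the four src corners to the four dst corners."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def warp(H, x, y):
    """Apply the homography to a single point."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

This is what lets content projected from an awkward angle still land square on an amp or picture frame: you nudge the four corners in the UI, and the software recomputes the transform.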

For playback, Eli took to our friend Isadora, running the TVs and screen as two stages. For a portion of the event, we were asked to include a remote performance by the Good Women Dance group, which was beamed in over FaceTime as they performed pieces in various locations along Whyte Avenue. To maintain a clean look for the screen, we used a MaxMSP patch that siphoned the output of FaceTime into Isadora. It was a bit clunky, occasionally flashing the wrong portion of the screen when FaceTime auto-rotated with the remote iPhone. (We later found an alternative in VDMX.)

Originally I wanted to use Eli’s license of Resolume Avenue for playback on my computer, but not having too much experience with it, I opted to use the last few days of my Millumin trial for the event.

It was a real winner for its ability to arrange different setups and recall them quickly, while also being able to arrange the layers into 6 different media positions in MadMapper. It was also great at normalizing different content sizes, all while keeping the Twitter bar running on the side. It even brought in a live video feed from an old Mini DV camera I brought on a whim, which turned out to be a killer effect.

Where Millumin fell down was its inline effects (or lack thereof). There are a few built-in essentials, like color adjustment and soft crop, that are great for everyday use. However, they aren’t easily adjustable on the fly across multiple layers unless you MIDI map each layer. Millumin’s author has assured me that custom effects are likely on the way, as custom Quartz filters could fairly easily be implemented in the video pipeline.

Callback

The wicked Dart Sisters of Catch the Keys Productions asked us to stick around for the second event of the Expanse festival Saturday night, featuring the amazing spoken word artist C. R. Avery.

With a chance to do it all over again with a bit of a sense of what helped with a live video performance, I decided to grab a license of VDMX for the second night. I had used VDMX years ago before they redesigned it, and knew how powerful it could be for a setup like the Expanse Cabaret. The great folks at VIDVOX were prompt in getting me a student discount, and I was up and running for sound check.

I had some trouble getting the video layers to the right size and crop method, something Millumin did so easily, but I marched into the darkness with VDMX and some revised video mapping areas for the new sets. It was a bit of a struggle to get where I wanted to go with VDMX, mostly because I hadn’t established much of a “rig”, which VDMX users are famous for building.

By the end of the night, I had discovered some neat built-in filters and effects, and was mixing in live video like I had in Millumin. Although we had no FaceTime to siphon in this night, VDMX has a great window capture system that lets you grab any open (or hidden) window as a video source.

Next Steps

I’m looking forward to experimenting with VDMX and building a custom rig for future events like this, as well as possible theatre applications.

With some great effects achieved with live video layering into our content, we’ll definitely be playing more with that.

We mostly used stock VJ clips for this event, but finding and creating clips for this purpose is definitely an area we’d like to look into. We had our Kinect hooked up for the event, but found that it wasn’t always effective in a projection-mapped environment. There may be other ways in which it can be applied, though.

With the success of the Twitter bar, we’ll definitely be seeing what other “set and forget” interactive experiences we can add to evening events.

Last May, Erin Gruber, Elijah Lindenberger and I began working together on some really great theatre projection and video projects. Although it has been no secret, I’d like to formally announce ShowStages on my blog, and invite everyone to subscribe, like, and follow ShowStages as we continue to pursue new interactive video opportunities.

ShowStages is a video and design collective. We build narratives through projected media and interactive audio-visual experiences. We work in theatre and new media.

This summer I had the opportunity to work on an interactive video/theatre project entitled DELETE. The brainchild of Stefan Dzeparoski, DELETE tells the story of Konstantin, a man trapped within a digital world. Erin Gruber served as the media designer, and I served as a media consultant. The Canadian Centre for Theatre Creation provided funding and organized the project, which took place in the University of Alberta’s Second Playing Space. The presentation was incorporated into the StageLab experimental theatre series at the end of August. Here’s what the CCTC wrote about it:

Delete examines the theme of what Dzeparoski calls “the eternal immigrant”. Its story depicts a man named Konstantin attempting to come to terms with a past that haunts him during the last hours before the end of the earth. Gifted with a perfect memory, he lives in a world of the not-too-distant future where everyone has come to rely on digital technology as a repository for past events. He too, has rejected his own memory in favour of the more conveniently selective digital version, but he cannot escape his gift and finds himself compelled to confront his loved ones before the clock runs out.

Beginnings

Initial mapping of scenic elements

Our first workshop in May 2012 explored the technologies we wanted to experiment with, together with the dramaturgical and visual thoughts we wanted to include. Our goal was to have ideas we could move forward in a week-long workshop process in August.

We played with the Microsoft Kinect sensor using a variety of tools, determining that NI Mate provided the simplest and most flexible output of the various software kits. Masking textures was particularly intriguing, feeding a silhouette output from NI Mate into Isadora via Syphon. This could then be aligned to the actual human form in front of the Kinect, projecting the virtual Konstantin in and around the physical character on stage. Green screen footage shot of the character served as our source video, along with a collage of other images collected in a Dropbox before the workshop weekend.
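The masking idea itself is simple: use the Kinect silhouette as a per-pixel stencil over the source texture. A toy Python sketch of the concept only; our actual pipeline did this on the GPU via NI Mate, Syphon and Isadora, and the pixel values here are invented:

```python
def mask_texture(texture, silhouette, background=(0, 0, 0)):
    """Show the texture only where the silhouette mask is 'on' (nonzero).

    texture: 2D grid of (r, g, b) tuples; silhouette: 2D grid of 0/1 values.
    """
    return [
        [tex if mask else background
         for tex, mask in zip(tex_row, mask_row)]
        for tex_row, mask_row in zip(texture, silhouette)
    ]
```

Aligned to the performer, the effect reads as video living inside the body’s outline.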

We also experimented with flame and alphabetic imagery mapped to physical hands using skeletal data into Quartz Composer. This too was mapped to the human figure, which could wave around fire and blow letters into the screen.

Projection mediums were also explored, using both traditional screens and draped fabric, which would inspire the final design in August.

Equipment setup

When we reconvened in August, we had full access to the University of Alberta’s video equipment. Key components included:

This was accompanied by a full sound setup from Matthew Skopyk and my own MacBook Pro, all linked on a gigabit network.

Video shoot in Corner Stage

Much of the video content needed to be produced before we could start experimenting on stage. We spent a day shooting in different locations on green screen and in real space. We shot video on a Panasonic HD camcorder straight into Final Cut Pro, logging video clips as we went. The batch export facilities allowed us to quickly split audio & video, key out green screen in batch, and convert them into Photo JPEG, an optimal codec for Isadora.

Tip: If you’re looking to key (green screen) out a number of clips at once using Final Cut, there’s an interesting trick to doing this. First, drag all the clips you want to process into a sequence. Apply the Key effect onto all the clips in the sequence, using Copy & Paste Settings… or similar. Then, drag these clips out of the sequence into a new media bin. These are now new independent clips with the effects applied to them, and can be Batch Exported out of Final Cut in the traditional fashion.

AirBeam

One discovery that we immediately incorporated into the project in August was the AirBeam wireless camera software. An app for iOS & OS X, AirBeam allows video to be streamed over the network to any Mac, iOS Device, or supported Web Browser. On the Mac side, the client can enable Syphon throughput, enabling a quick and simple low-latency on-stage video source straight into Isadora or any other Syphon-enabled video application. It’s completely wireless, and lasts at least an hour on battery.

The final scene when a processed live AirBeam image evaporates in the background

This proved a hit for the project, and was used throughout the show. My iPhone was placed in different positions, handheld and on set pieces. I wrote a custom Quartz/Isadora patch called AirBeam Controller that allowed us to enable/disable the flashlight, select front/back camera and more. We dropped the actor in each Isadora scene with the right settings, and the iPhone was always in the right configuration when we needed it.

One thing we did notice was that the battery drained quite quickly on the iPhone. I had an external battery pack, but still we needed to make sure it was being charged when on breaks during the workshop process. I suggested to the developers of AirBeam that a battery meter be integrated into the monitoring software for this exact purpose, which they gladly accepted.

We also placed Erin’s iPad in the grid to use as a bird’s-eye view of the scene. This provided a second AirBeam camera source into the setup at some points during the performance.

Lady Isadora

Being relatively new to Isadora, I definitely learned some hard lessons about the architecture of the program in developing our setup for DELETE. We mostly relied on the “classic” actors in Isadora, as they provided convenient access to the 3D Quad Distort actor, an essential for mapping video elements onto the hung “picture frame” elements in the physical space. We had access to the Core Image actors, but they didn’t provide an intuitive enough way to map video elements to our set.

Isadora in use

Some of the classic actors, however, proved extremely CPU-intensive for the number of outputs and mappings we were performing. We diagnosed some of the delays in processing, but even still, Isadora had a hard time keeping up. I learned that the Isadora video core is largely CPU based, as it was designed in a time when that was the best way to achieve reliable effects on different platforms. As a result, all video processing must occur within a single thread through the CPU before being displayed. Multi-threaded video playback is a beta feature, but was not stable enough for our needs.

This dropped our framerates down to 8-12 fps, which wasn’t acceptable. We optimized scenes where we could, however we ultimately needed to render out several scenes using the built-in Isadora facilities for our performance to improve playback.

In the future, I would definitely look into better options for video mapping (I’m already playing with my own copy of MadMapper these days), and also make sure that the video actors we are working with are reliable and low CPU-usage. Core Image actors are obviously ideal, but they don’t always provide matching functionality to the Isadora ones.

Process

One great thing about working with Stefan on this project was how important the video elements were to the story. We easily spent 75% of our time working through visual effects, video storytelling, and experimenting with new technologies. This allowed the video to truly serve as a dramaturgical element, something that is difficult to achieve when time and resources are not as abundant, as in an ordinary theatrical production. It certainly serves as a great example to me of how effective video can be when it is part of the development process from all angles.

Erin Gruber demonstrating Kinect triggers

Matthew Skopyk, who provided the sound design for DELETE, eagerly integrated audible elements with our visuals. We provided Matt with all of the audio tracks from the video we recorded, allowing him to process and modify them. We then triggered the sound cues over OSC from Isadora, playing back the tracks in Matt’s Ableton Live setup. In May, we also experimented with synchronizing intentional audio “glitching” with simultaneous video “glitching”, to create the illusion that the virtual Konstantin was fading away into the virtual world.
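We used Isadora’s built-in OSC output for this, but the wire format is simple enough to sketch by hand. A minimal Python encoder for the kind of cue message involved; the /cue address and arguments below are hypothetical placeholders, not our actual cue names:

```python
import struct

def _pad(b):
    # OSC strings are null-terminated and padded out to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address, *args):
    """Encode a minimal OSC message supporting int, float, and string arguments."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"; payload += struct.pack(">i", a)
        elif isinstance(a, float):
            tags += "f"; payload += struct.pack(">f", a)
        elif isinstance(a, str):
            tags += "s"; payload += _pad(a.encode())
        else:
            raise TypeError(a)
    return _pad(address.encode()) + _pad(tags.encode()) + payload
```

Send the resulting bytes over UDP (e.g. `socket.sendto`) to the listener’s port and a receiver like Ableton with an OSC bridge picks up the cue.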

Outcomes

The single performance of the show attracted a large audience into the SPS, who responded well to the 25 minute piece. It was great to use the Kinect & AirBeam on stage in such effective ways and with a team who was genuinely interested in integrating them into the piece.

The project served as a catalyst to the creation of the ShowStages video collective, consisting of Erin Gruber, Joel Adria, and Elijah Lindenberger, interested in pursuing these kinds of projects further.

Be sure to check out more images from the project in the gallery below, and listen to the podcast episode recorded immediately after the show.

One of my favourite new iOS apps of late is AirBeam [$3.99], a simple way to transmit video from your iPhone, iPad, or iPod touch’s built-in camera to a computer, another iOS device, or, with the companion Mac OS X app, into Syphon. Performance is very good, with low latency, high frame rate, and bonus features like remote video recording and remote iPhone flash on/off. Add in Syphon functionality, and you have a low-latency go-anywhere camera ready to be used in the video tool of your choice.

I used AirBeam on a few shows this summer (DELETE & The Haunted Reel), the latter using it as a primary part of the performance. In the former, we used Isadora to manipulate content quickly, but needed a simple way to control the two iOS devices we were using as AirBeam cameras.

Enter AirBeam Controller. This simple Quartz patch can be dropped into your Isadora Quartz directory to control any AirBeam device on your network. It works by sending simple HTTP requests to AirBeam’s built-in web server using the Load Web Page QC patch. This might not be the most direct or efficient route, but it definitely works and performed well in our show.
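In plain code terms, the patch boils down to firing GET requests at the phone. A hedged Python equivalent of the idea; the command path, port, and parameter names below are invented placeholders, so check the requests AirBeam’s own web interface makes for the real endpoints:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

def airbeam_url(host, command, port=8080, **params):
    """Build a control URL for an AirBeam device on the local network.

    NOTE: the /command path and parameters are HYPOTHETICAL examples only.
    """
    query = ("?" + urlencode(sorted(params.items()))) if params else ""
    return f"http://{host}:{port}/{command}{query}"

def send(url, timeout=2.0):
    """Fire the request and ignore the body, much like QC's Load Web Page."""
    with urlopen(url, timeout=timeout) as resp:
        return resp.status
```

Dropping an equivalent controller into each scene means the camera is always in the right configuration when the scene activates.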

Installation

AirBeam Controller in Isadora Core

Isadora Core

Unzip the AirBeam Controller zip archive.

Copy the .qtz file to ~/Library/Compositions/ or /Library/Compositions/ or /System/Library/Compositions/ (your account, or the entire computer. Go to Isadora > Preferences… > Video to make sure you are scanning those locations.)

After a bit of a late start in August, Bearbook is back in full force for the Fall semester.

We’ve made some big improvements to the backend of Bearbook, which will make new features quicker to appear, and course information more detailed and reliable.

Appearance

We’ve updated the interface with a new navigation menu and upload button. We’ve also tidied up the timetable information, and allowed you to see who’s available in your breaks right inside the timetable!

Search

Quickly find anything on Bearbook with the new search bar. Find classmate timetables, professors, and soon: courses and textbooks.

Course Info

Thanks to improved access to the University course database, we are now able to provide better course info. Stay tuned for more features in this area.

Magic Table

Building timetables on Beartracks is a big pain. MagicTable, an experimental feature, makes it a breeze. Type in the courses you need, and Bearbook will provide you with several different timetable options. Use the visual guides to find a schedule that fits your lifestyle: morning classes, evening classes, short or long school days. It even shows you which friends you would be matched up with!
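For the curious, the core idea is enumerating one section per course and rejecting combinations whose time slots overlap. A toy Python sketch of that concept; this is not Bearbook’s actual implementation, the data shapes are invented, and real timetables need smarter pruning than brute force:

```python
from itertools import product

def conflict_free_timetables(courses):
    """courses maps a course name to its candidate sections, each a frozenset
    of (day, hour) slots. Yields one section choice per course with no overlap."""
    names = sorted(courses)
    for combo in product(*(courses[n] for n in names)):
        taken = set()
        ok = True
        for slots in combo:
            if taken & slots:  # this section collides with one already chosen
                ok = False
                break
            taken |= slots
        if ok:
            yield dict(zip(names, combo))
```

Ranking the surviving combinations by properties like earliest class or total gap time would then produce the “morning classes, evening classes, short or long days” options.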

Remember, MagicTable is an experimental feature. It might not always work perfectly. Consult the University Calendar or an academic advisor if you aren’t sure about a particular course option.

As always, Bearbook Books is still available and the classic ability to find the friends in your classes is still there. So what are you waiting for? Upload your timetable to Bearbook now, and get connected!

Via Flickr:
Photos from Floating Stones Theatre’s production of Echoes, at the Edmonton International Fringe 2012.

T. Erin Gruber, a collaborator of mine, asked me to shoot a few photos for Echoes, a project that used projection as a narrative dramatic feature. Check out some of the stunning visuals that came out of it!

More info about the show:

"There is a new story to be told…" The trickster. The protagonist. The helping hand. Stolen relics and magical lands

Playing around with 3D lighting in Quartz Composer. Shadows are rendered somewhat lamely, but still very cool for any sort of experimenting, prototyping, or less critical interactive components. In this demo I was testing out my custom Quartz nodes (Editor -> Add to Library) for my Korg nanoKONTROL MIDI controller to manipulate parameters on the fly. Apologies for the crappy audio; my computer fans were blasting and I was too lazy to plug in a mic. Recorded with the most excellent Syphon Recorder.

In preparation for Wave’s deployment to the Edmonton International Airport this month, I needed to do some research on the WiFi there before we launch.

To get there, I decided to take advantage of a new ETS-EIA transit partnership that runs a city bus between Century Park and EIA about every hour. The aptly named Route 747 is equipped with luggage racks (shown here), and costs $5 each way. Since that is about the cost of parking at the airport for an hour, I figured this would be a reasonable alternative for my research excursion.

It was extremely convenient, taking about 18 minutes each way by my count. I would definitely consider using it for future flights. There’s no need to worry about road conditions or traffic; you just get on, surf the web, and before you know it you’re at the airport! No more airport parking shuttles or parkade difficulties. Drop-off and pickup are both on the arrivals lower level, so you’ll need to take the escalator up if you’re heading out, but that isn’t too big of a deal.

As an added bonus, the airport bus also includes free WiFi. It appears to be provided by Rogers Wireless. Not terribly quick, but certainly handy!

I joined Elijah Lindenberger & Erin Gruber to set up and create a projection atmosphere for the evening. Held at the Artery, we spent the day setting up four different projectors around the venue. One helped showcase some of Erin’s artwork, powered by my little BenQ paired with her iPad 2 via VGA adapter. A second pointed onto a side wall for experimenting with projection mapping onto the oversized clock & ghostly cellophane figures we inherited. Erin used Isadora on her 27″ iMac to power the mapping.

I set up my MacBook Pro with the Kinect, creating a silhouette image on another wall. We experimented with placement of the Kinect, but ultimately found we were limited by the stock length of the USB cable. It still turned out well, and I threw in a bit of Twitter magic using Quartz Composer to create the “creepy wall”. Originally I had it configured to download #nextfest-related Twitter pics right in the composition, but it turned out to be buggy. I instead opted to use a temporary folder that QC scanned periodically, and saved images to it manually throughout the night. That’s one advantage of live video performance vs. installation!

Finally, Eli set up his MacBook Pro with the fourth projector, used as a backdrop to one of the performance stages. Using his mad VJ skills and a custom Isadora setup, he kept the cool visuals flowing right until the end of the night.

This was my first Nextfest, so it was great to get out and take part in such a vibrant local art scene! The openness to the different types of performances and styles was tremendous, and made our experiments feel truly valued. I shot a few photos of the performers; they were all exceptionally great!

If you’re interested in adding a bit of projection flare to your next event, our trio would love to be a part of it. Get in touch through the usual means (@jole, jole.ca/contact, flickrmail, whatever floats your boat).

Nextfest has a bit of a contest on, encouraging shutter bugs to take photos during the festival and submit them for inclusion in the 2013 program, with credit. Pretty awesome!

It was a bit of a challenge balancing the exposure of the performers with the projections, and on some of the shots I regret locking focus on the screens. Grain was a challenge on my little D40 in such a dim environment, but some of them turned out ok! Here’s the best of the shoot.