‘visualising performer–audience dynamics’ spoken paper accepted at ISPS 2017, the international symposium on performance science. this is doubly good, as i’ve long been keen to visit reykjavík and explore iceland.

need a hit-test for people orienting to others. akin to gaze, but the interest here is what it looks like you’re attending to. but what should that hit-test be? visualisation and parameter tweaking to the rescue…
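as a sketch of what such a hit-test might look like (names and the cone-angle parameter are my illustration, not the project's actual code): treat each person's head pose as a direction vector and count a "hit" when a target falls within some angular cone around it — the cone half-angle being exactly the kind of parameter you'd tweak while watching the visualisation.

```python
import numpy as np

def orientation_hit(head_pos, head_dir, target_pos, cone_half_angle_deg=20.0):
    """Return True if target_pos falls within a cone around the head's
    facing direction. cone_half_angle_deg is the tunable parameter."""
    to_target = np.asarray(target_pos, float) - np.asarray(head_pos, float)
    dist = np.linalg.norm(to_target)
    if dist == 0:
        return False
    to_target /= dist
    head_dir = np.asarray(head_dir, float)
    head_dir = head_dir / np.linalg.norm(head_dir)
    # angle between facing direction and direction to target
    cos_angle = np.clip(np.dot(head_dir, to_target), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle)) <= cone_half_angle_deg
```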

with the visualiser established, it was trivial to attach the free view camera to the head pose node and boom!: first-person perspective. to be able to see through the eyes of anyone present is such a big thing.

of course, aligning the virtual camera of the 3D scene with the real camera’s capture of the actual scene was never going to be straightforward. easy to get to a proof of concept. hard to actually register the two. i ended up rendering a cuboid grid on the seat positions in the 3D scene, drawing by hand (well, mouse) what looked about right on a video still, and trying to match the two sets of lines by nudging coordinates and fields-of-view with some debug-mode hotkeys i hacked in.
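the nudge-and-compare loop amounts to minimising reprojection error by hand: project the 3D grid through a candidate camera, measure how far the result lands from the hand-drawn lines, nudge, repeat. a minimal sketch of that error measure, assuming an unrotated pinhole camera looking down −z (the real registration would need rotation too):

```python
import numpy as np

def project(points, cam_pos, fov_deg, image_wh):
    """Pinhole projection of world points; camera at cam_pos looking
    down the -z axis, no rotation, for brevity."""
    w, h = image_wh
    f = (w / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length in pixels
    p = np.asarray(points, float) - np.asarray(cam_pos, float)
    x = f * p[:, 0] / -p[:, 2] + w / 2
    y = f * p[:, 1] / -p[:, 2] + h / 2
    return np.stack([x, y], axis=1)

def registration_error(annotated_2d, grid_3d, cam_pos, fov_deg, image_wh):
    """Mean pixel distance between hand-annotated points and the
    projected grid -- the number to drive toward zero while nudging."""
    proj = project(grid_3d, cam_pos, fov_deg, image_wh)
    diffs = proj - np.asarray(annotated_2d, float)
    return float(np.mean(np.linalg.norm(diffs, axis=1)))
```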

third gig: angel comedy. again, an established comedy club and again a different proposition. a nightly, free venue, known to be packed. wednesdays was newcomers night which, again, was somewhat appropriate.

the second gig of our tour investigating robo-standup in front of ‘real’ audiences: gits and shiggles at the half moon, putney. a regular night there, we were booked amongst established comedians for their third birthday special. was very happy to see the headline act was katherine ryan, whose attitude gets me every time.

getting a robot to perform stand-up comedy was a great thing. we were also proud that we could stage the gig at the barbican arts centre. prestigious, yes, but also giving some credibility to it being a “real gig”, rather than an experiment in a lab.

happy few days bringing-up a visualiser app for my PhD. integrating the different data sources of my live performance experiment had brought up some quirks that didn’t seem right. i needed to be confident that everything was actually in sync and spatially correct, and, well, it got to the point where i decided to damn well visualise the whole thing.


hacked some lua, got software logging what i needed; learnt python, parsed many text files; forked a cocoa app, classified laugh state for fifteen minutes times 16 audience members times two performances; and so on. eventually, a dataset of collected audience response for every tenth of a second. and with that: results. statistics. exciting.
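turning hand-labelled stretches of laughter into a response value for every tenth of a second is essentially resampling annotation intervals onto a fixed-rate timeline. a minimal sketch of that step (function and label names are my illustration, assuming intervals come in as start/end times in seconds):

```python
def resample_labels(intervals, duration_s, step_s=0.1):
    """Convert (start, end, label) annotation intervals into a
    fixed-rate timeline sampled every step_s seconds."""
    n = int(round(duration_s / step_s))
    timeline = ["none"] * n
    for start, end, label in intervals:
        lo = int(round(start / step_s))
        hi = min(int(round(end / step_s)), n)
        for i in range(lo, hi):  # label every sample the interval covers
            timeline[i] = label
    return timeline
```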

a major part of the analysis for comedy lab is manually labelling what is happening when in the recordings. for instance, whether an audience member is laughing or not – for each audience member, throughout the performance. all in all, this adds up to a lot of work.

“Hello, weak-skinned pathetic perishable humans!” begins the stand-up comic. “I am here with the intent of making you laugh.”
A curiously direct beginning for most comics, but not for Robothespian. This humanoid robot, made by British company Engineered Arts, has the size and basic form of a tall, athletic man but is very obviously a machine: its glossy white face and torso taper into a wiry waist and legs, its eyes are square video screens and its cheeks glow with artificial light.

Robothespian’s first joke plays on its mechanical nature and goes down a storm with the audience at the Barbican Centre in London. “I never really know how to start,” it says in a robotic male voice. “Which is probably because I run off Windows 8.”

The performance last week was the brainchild of Pat Healey and Kleomenis Katevas at Queen Mary University of London, who meant it not only to entertain but also to investigate what makes live events compelling.

As we watched, cameras tracked our facial expressions, gaze and head movements. The researchers will use this information to quantify our reactions to Robothespian’s performance and to compare them with our responses to two seasoned human comics – Andrew O'Neill and Tiernan Douieb – who performed before the robot. […]

getting a robot to tell jokes is no simple feat. programming and polishing a script for the robot to deliver is challenge enough, but trying to get that delivery to be responsive to the audience, to incorporate stagecraft that isn’t simply a linear recording… now that is hard. of course, in the research world, we like hard, so reading the audience and tailoring the delivery to suit is exactly what we set out to do.

“good evening ladies and gentlemen, welcome to the barbican centre. ‘comedy lab: human vs robot’ will be starting shortly in level minus one. part of hack the barbican, it is a free stand-up gig with robot headlining.”