"Live Coding" at V2_ Test_Lab

Practices, ideas, and environments: an interview with Michel Van Dartel and Artem Baguinski by Silvia Scaravaggi.

V2_ is internationally renowned as one of the most active places for new media art practices and research. You
recently presented the event Test_Lab: Live_Coding, featuring debates,
presentations, and performances on Live Coding. What kind of activities
does Test_Lab support and organize?

[MvD] The core idea of
Test_Lab is to create an informal setting, or platform, for artists and
developers to "test" their latest artistic Research and Development
(aRt&D) work. For each edition, we invite artists working within a
specific aRt&D theme to bring their prototypes to Test_Lab for a
live demonstration and, preferably, for the audience to try out and
play with. In this sense, Test_Lab always works in two directions:
artists use the Test_Lab audience as a "test panel" to receive feedback
on their ideas and work before it enters the museums and festivals or
before moving on to a next development stage. At the same time, the
Test_Lab audience learns about the latest in artistic R&D and has
buckets of fun trying things out themselves. Regarding the format of
Test_Lab, we like to experiment a lot and let the format depend on what
we'd like to show or who we'd like to invite. In the past, we have done
things like taking our audience out onto the street to play games; had
concerts, Second Life performances, dj performances, and audience
jam-sessions; relocated Test_Lab to the Erasmus Medical Center; and
included fashion shows, theater, and various mini-workshops in the
program. For each edition of Test_Lab we develop a different theme, and
with those themes we try to tap into what we think are important
developments and issues in current aRt&D. Moreover, the themes come
forth from discussions in the V2_Lab: questions that arise from
projects that we are working on, technologies or approaches that we
think are interesting, or things that we would like to learn more about in
view of an upcoming project at V2_. In past editions, we have focused on
broad themes like technologies for the performance arts, technology in
fashion, and the notion of play in electronic art, but also on more
specific topics, such as physical audio interfaces and, most recently,
Live Coding.

How is V2_Lab involved in Live Coding, and how did you arrive at this kind of event?

[MvD]
Test_Lab: Live_Coding is a typical example of how a Test_Lab theme
evolved from V2_Lab discussions. For years, Artem has been involved in
Live Coding and has been bugging the other Lab members with info on the
software paradigm and how revolutionary it is in its approach. But it
was only when we recently encountered some difficulties in the
development of one of our projects, a collaboration between Rnul, Carla
Mulder, and V2_, that we decided to turn it into a Test_Lab theme. The
problem that the project team encountered had to do with making an
Augmented Reality environment more dynamic, so that it could be applied
to improvised theater performances. Artem, naturally, offered Live
Coding techniques as a possible solution to the problem, and from there
on we decided to present the project, and its first steps towards a
Live Coding solution, to the Test_Lab audience within a context of
other Live Coding projects and performances. In this way we hoped to
receive feedback that would be beneficial to the project's further
development and, at the same time, to introduce the audience to the
Live Coding software paradigm.

What did the "Test_Lab: Live_Coding" event feature?

[MvD]
The evening was opened by Florian Cramer of the Piet Zwart Institute,
who gave a very clarifying and entertaining presentation on the history
of Live Coding. Besides introducing what is understood by Live Coding, he
also related it to the philosophy of early free jazz and worked his way
from there, through early electronic composers, towards the
groundbreaking Live Coding work of TOPLAP. After Florian's
introduction, the Live Coding audio performance collective Powerbooks
Unplugged introduced their approach and then gave a beautiful
unplugged performance with their Powerbooks, positioned spread out
through the audience. After PBUP's performance, all chairs were taken
out to make room for a combined presentation by interaction developers
Rnul, theatre maker Carla Mulder, and V2_Lab's Jan Misker and Jelle van
der Ster. They demonstrated the result of a week of trying to apply
Live Coding techniques to an Augmented Reality environment. First they
handed out 3D glasses to the audience, so everyone could see the 3D
projection of the Augmented Reality installation, and then Carla Mulder
presented what they had developed so far and how she was planning on
using it in her theater performance. In fact, Carla's presentation was
in many ways a theater performance in itself, making it a perfect test of
what had been developed during the week before. This was followed by a
parallel session in which the audience could choose either to attend
one of the mini-workshops on the Live Coding programming languages
SuperCollider, led by PBUP, and Fluxus, led by Artem Baguinski, or to
play with the Augmented Reality installation. The evening was closed
with drinks and a very danceable audio performance, based on the Live
Coding environment Max/MSP, by Susie Jae, aka Jean van Sloan.

How did you, Michel and Artem, cooperate on this event?

[MvD]
Although I curated the program, Artem was definitely the main source of
information during the event's preparation. First he gave me a personal
introductory course on Fluxus and then directed me to a collection of
relevant papers and websites on Live Coding. From there on, Artem
focused on preparing his contribution to Test_Lab, the Fluxus
mini-workshop, and I got in contact with Florian Cramer and some TOPLAP
people: Alex McLean, Dave Griffiths, Adrian Ward, and Julian Rohrhuber,
to further develop the program. Although not all of them could make it
to Rotterdam for the event, they were a big help in setting up the
program by getting me in touch with the right people. Of course, for
any decision regarding the program I consulted Artem's expertise on the
topic.

Which were the emerging themes during the presentations and the debates?

[MvD]
It's funny that you call them "emerging" themes, because the evening
slightly surprised me with the issues that came up during the debates.
While we had set up the evening around the live improvisation aspect of
Live Coding, and I had expected much discussion on things related to
that (such as the discussion on whether actually showing the 'raw code'
during Live Coding performances is essential), the discussion got most
heated around the definition of Live Coding. For instance,
according to some, what was used to make the Augmented Reality
experience dynamic in the demo by Rnul, Carla Mulder, and V2_Lab did
not fall within the definition of Live Coding, since the calls to the
visuals were performed live but the visuals were not live-generated.
According to others their approach did fall within the definition of
Live Coding, since there was no need for compiling or rendering to make
dynamic changes in the output of the software.

[AB] Live
Coding isn't always about making code and media from absolute zero:
although you definitely may start from scratch, often you'd use
pre-written scripts as well as resources - textures, fonts, samples
etc. The crucial part is how you then manipulate them: once you've
started editing the code that uses those resources and executing the
new version of the code on-the-fly and on-stage, you are coding live. In
this case, what was coded live was the behaviour of objects rather than
their appearance, which is probably why it wasn't so apparent.

[MvD]
Furthermore, there was a lot of discussion on the differences between
using GUIs and Live Coding, but maybe Artem can explain this better...

[AB]
I guess in my corner of the floor the discussion went in a somewhat
different direction: I found myself discussing some instruments
frequently used in Live Coding (ChucK, Fluxus, and SuperCollider) and
how various aspects of their languages and GUIs help you code (be it
live or not). Neither using GUIs nor visual programming (as in Pure
Data, Max/MSP, or StarLogo TNG [1]) is foreign to Live Coding; it is
the attitude, the desire to make your thought process public by
displaying the code behind the media, that counts. To quote the TOPLAP
manifesto: "Live coding is not about tools. Algorithms are thoughts.
Chainsaws are tools. That's why algorithms are sometimes harder to
notice than chainsaws." Some practitioners of Live Coding are
constantly in search of more tangible ways to represent, create, and
input algorithms than traditional text-based programming languages and
editors. Some examples are BetaBlocker [2] and Al-Jazari [3] by David
Griffiths, and the Rubik's Cube DJ [4] by Douglas Edric Stanley. These
might not be approved by TOPLAP as "official" live coding instruments,
but they do show an existing interest in alternatives to plain text.

Where does live coding come from?

[AB]
Live Coding consists of programming in front of the audience, while the
program one writes is running. Most often the program being (re)written
generates or processes sound or visuals - "rich media". So, what
happens is - the performer has some sort of system that can generate or
process rich media, and the way it actually does that can be specified
by a program of some sort. The performer starts the system and opens a
program for it, or starts from scratch, in an editor. OK, he or she
thinks, I want this or that to happen and here is how I'd program that;
he or she types in the program, or modifies the existing one, and
starts it up. Now the audience and the performer can see or hear the
result of the program that has just been written. It could be that the
program doesn't do exactly what was intended - due to an error or some
misunderstanding; or the program does exactly that, but the performer
now wants something different, maybe more complex, but still based on
the original idea. So he or she goes back to the editor, modifies the
program, starts it again, and repeats the process over the course of
the performance again and again. Now, to make it more
interesting for the audience, care is usually taken to make the
transitions between the old version and the new seem smoother, or even
that the original program doesn't really stop, but just changes over
time. Different artists use different tools and different ways to make
Live Coding feel like a continuous coherent performance and not like a
series of experiments. In practice one doesn't often really start from
scratch - you've got some code fragments prepared, some ideas tried
out, maybe some library (code that you use without it actually
appearing in the editor).
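The edit-run-modify loop described above can be sketched in a few lines of Python. This is a purely hypothetical illustration (none of the actual tools mentioned in this interview work this way): the running "instrument" keeps asking the current pattern function for output, while the performer re-executes an edited version of that function without ever stopping the loop.

```python
import math

# The performer's code lives in a dictionary that the running
# "instrument" looks functions up in, so a re-executed definition
# takes effect immediately.
state = {}

# The initial program: a plain sine pattern.
exec("def pattern(t):\n    return math.sin(t)", globals(), state)

output = []
for t in range(6):
    if t == 3:
        # Mid-performance the performer edits the code and
        # re-executes it; the loop itself never stops.
        exec("def pattern(t):\n    return math.sin(t) * 0.5 + 0.5",
             globals(), state)
    output.append(round(state["pattern"](t), 3))

print(output)  # the pattern changes character from t=3 onward
```

The point is not the arithmetic but the structure: the output stream is continuous, and the transition happens between two ticks rather than between two program runs.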

[MvD] In the mini-workshops Jan-Kees
van Kampen (of Powerbooks Unplugged) illustrated this very clearly by
starting up a simple sine tone in SuperCollider, typing the command for
a sine and some parameters between brackets to define its variables,
the amplitude et cetera. Then, while fiddling with the parameters a
bit, changing the pitch of the tone and such, and by adding some more
commands to the line, introducing temporal manipulations on the tone,
the tone changed into a cute repetitive pattern. He explained that this
is the type of thing he prepares for a performance: during the
performance he doesn't write every sound from scratch, but prepares
some lines of code like he did just now and simply pastes them into the
running editor. Pasting these pre-prepared code fragments and fiddling
with their parameters is how he improvises with the other Live Coders,
or whoever he is jamming with.
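As an analogy in Python rather than SuperCollider (hypothetical; the names and numbers are made up, not Jan-Kees's actual code), the paste-and-fiddle workflow might look like this: a fragment prepared before the show is pasted into the running session, and then pasted again with its parameters tweaked.

```python
# The running session into which fragments are "pasted".
session = {}

# A fragment prepared before the performance: a repeating pattern
# built from a base frequency and a step count, both left as
# parameters to fiddle with.
fragment = """
freq = {freq}
steps = {steps}
pattern = [freq * (i % steps + 1) for i in range(8)]
"""

# Paste the fragment with one set of parameters...
exec(fragment.format(freq=220, steps=2), session)
print(session["pattern"])   # an alternating two-note pattern

# ...then fiddle: the same fragment, different parameters.
exec(fragment.format(freq=110, steps=4), session)
print(session["pattern"])   # a rising four-note pattern
```

The improvisation happens in the parameters and in when each fragment is pasted, not in typing everything from zero.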

And how is Live Coding changing the programmer's role in the creative process?

[AB]
In the context of performance I'm not sure it does - there are artists
who can code and also practice the performing arts; both activities are
creative endeavours, and eventually some of these artists find
themselves combining the two, by creating their own instruments, or
coding during the performance, or both. Rather than changing their
role, Live Coding allows them to demystify it: look, I am programming
now, here is how I do it, here is what my code looks like, and here is
what it does - all at the same time, as a coherent and completely open
performance. However, when we at V2_Lab use Live Coding practices in
an augmented reality art production process, I think the character of
our involvement as engineers changes - it becomes much more direct and
immediate, due to the much faster feedback that we and the artists
have. Production of complex electronic installations is always an
iterative process of trial-error-reevaluation, and the ability to try
various ideas out on-the-fly, as they arise in a brainstorming /
improvisation session, transforms the engineer into a sort of organic
interface to the technology or an augmented actor - depending on the
perspective taken.

How would you describe the contemporary Live Coding art scene?

[AB]
On the theoretical side: self-criticizing, self-defining,
self-searching - there is a lot of reflection going on: what
constitutes Live Coding, is it an autonomous form of art, or is it a
stylistic addition to more traditional audio visual performance? What
is the importance of making the code visible and does it matter which
form the code takes on the screen? What does it mean to the artist and
the audience? On the practical side: experimental and daring. The
theoretical discussions arise from praxis and the re-evaluation of
one's own experience and motivation, but they don't constrain what
people actually do. You could say the theory of Live Coding, if there
is such a thing, is descriptive rather than prescriptive - it analyses
what's going on and attempts to reinvent itself accordingly, rather
than giving guidelines on how Live Coding should be exercised.

How can you define the kind of "interaction/interactivity" Live Coding is able to realize?

[AB]
Ideally, what we aim at is interaction between the physical environment
surrounding the artist and the software system, on the very intimate
level of the code constituting that system. There are multiple feedback
loops here, because the input/output of the software system connects it
to the environment, just like the artist's own sensory system does.

What does Live Coding mean, in your opinion, in terms of improvisation, live performance, and interface?

[AB]
Many artists and programmers make art with computer code - look at
runme.org or at the Obfuscated C Code Contest - I consider some of the
entries there a sort of code poetry. Code-based art is very holistic -
often its beauty is only apparent when seen against a background of a
certain computer subculture or language. Obfuscated code might use
unconventional formatting to represent the structure or some properties
of the algorithm that the very same code implements, just like the
"Tale" poem in Carroll's Alice in Wonderland. And just like you've got
to understand
English to see the link between its content and its mouse-tail-like
shape, you've got to understand code and often be familiar with
programmers' culture, folklore and mythos, to see all the "inside
jokes", references, and analogies. By creating the code on the fly and
modifying it as it runs, the live coder gives the audience a chance to
get a feeling for what it means, even if the language is unfamiliar or
the text is barely readable. By following the editing actions, the
fixing of mistakes, the hesitations and the scrolling through the code,
and hearing /
seeing the result of the code at the same time, the audience comes
somewhat closer to the holistic appreciation of what is going on -
since they not only experience the media output, or only see the code
behind it - but experience both simultaneously as they come to life and
evolve. Improvisation is the goal of live coding - that's why you do it
live, and that's why you create and improve the instruments: to get
them as much out of the way as possible while remaining useful and
powerful. And in fact the interface has a dual use: for the performer,
the editor is the interface to the instrument; it can constrain or
empower. For the audience, on the other hand, the editor is yet another
interface to the artist's mind, compensating for the physical body
often being hidden behind the laptop.

What happens between the "live coder" and the DJ, the dancer, or the audience in real time?

[AB]
Technically the simplest, but otherwise the most important, connection
with other performers is the one that goes through one's brain - just
like in any collaborative improvisation, you watch and listen to your
co-performers and let them influence your ideas as to where to go next.
Often you'll know upfront what to expect from your collaborators, and
it helps to prepare your own material so that it works well with that.
But again - you can start blank and work toward coherence on-the-fly. But
since we've got computers, we can let them do some boring tasks,
freeing ourselves for more abstract and high level "linking". Thus when
working with a musician you could make a computer analyse the
microphone input and turn the sound into input for the code. When
collaborating with other computer performers you could send each other
signals or even fragments of code over the local wired or wireless
network.
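That last idea, sending fragments of code between machines, can be sketched with a minimal Python example. This is hypothetical: a real performance setup would define its own protocol over the local network; here a local socket pair stands in for it.

```python
import socket

# Two connected endpoints standing in for two performers' machines
# on a local network (socket.socketpair gives us a ready-made pair).
sender, receiver = socket.socketpair()

# Performer A shares a small fragment of code...
sender.sendall(b"tempo = 120\npattern = [tempo * i for i in range(4)]")
sender.close()

# ...and performer B executes the received fragment inside their
# own running session, as if it had been typed locally.
session = {}
code = receiver.recv(4096).decode()
exec(code, session)
receiver.close()

print(session["pattern"])  # the received fragment now runs locally
```

In practice the interesting questions are musical rather than technical: what you do with a collaborator's fragment once it is running alongside your own.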

[MvD] I think that in terms of interaction between a
Live Coder and other kinds of artists and/or the audience, the strength
of Live Coding is not so much in that it produces completely different
audio or visuals than could be achieved with standard software and
hardware... Live Coding's real strength is that it provides absolute
freedom in the manipulations that can be carried out there and then, at
that moment, and therefore has a great impact on the possibilities for
live improvisation, and thus results in a completely different,
interactive improvisation process than would be possible using
standard software and hardware. In other words, it basically enlarges
the creative space by minimizing the parameters defining that space.
When you use a sequencer to manipulate a tone, you are restricted to
the functions of the faders and knobs and the predefined procedures of
the sequencer; with Live Coding there are basically no such
restrictions, and any manipulation that you'd like to do can be coded
at that moment, there and then, providing much greater freedom to
improvise. As a musician I find this very appealing, although it seems
to take a while to develop the skills required, since, so far, I'm
still only fiddling with the parameters of simple sine tones... On the
other hand, people
spend lifetimes mastering the piano, so why not spend as much time
mastering your laptop?

Artem
Baguinski has worked as a software engineer at V2_Lab since 2000,
working with artists in residence and on the lab's own research and
development projects. The projects he works on range from
unconventional graphical interfaces for multimedia databases, through
realtime video processing, to virtual and augmented reality. Artem is
an avid user and advocate of open source software and contributes to
various open source projects whenever he can, with code, advice, or bug
reports. For the last couple of years Artem has been contributing to
Fluxus - a software tool for live coding of visuals - and using it in
spontaneous VJ/musician collaborations. Among Artem's other interests
are natural and computational linguistics, philosophy, and the natural
sciences.

Michel van Dartel
works as a project manager and assistant curator at V2_, Institute for
the Unstable Media, which he joined in 2005. As project manager he is
involved in a variety of artistic R&D projects at V2_Lab. Besides
that, as assistant curator, Michel coordinates a bi-monthly event named
Test_Lab, a platform to demonstrate, test, present, and/or discuss
contemporary artistic Research and Development (aRt&D), and is
involved in other V2_ public events such as the Dutch Electronic Art
Festival (DEAF). Prior to his current appointments at V2_, Michel
worked as a researcher at Maastricht University, where he investigated
knowledge representation in robot models and from which he received a
PhD in Artificial Intelligence and an MSc in Cognitive Psychology.
Besides electronic art, artificial intelligence, and cognitive
psychology, Michel's main interest is music. He is a performer and
songwriter in The Rose Frustrates and is active as a DJ.

This text was used for an article about the Live Coding Test_Lab, published in the Italian magazine Digimag.
