Lecture and Workshop: Machines in the Landscape of Plemenitaš for Croatian Art Students

We spent the past weekend with a group of students and their professors Ines Krasić, Mirjana Vodopija and Iva Ćurić from the Academy of Fine Arts Zagreb in the idyllic village of Plemenitaš, where they have one week of lectures and workshops on the topic of SOUND, led by experts ranging from mathematicians and physicists to composers, musicians and artists. Situated in the middle of this field of experts, we presented our work and Empathy Swarm in the relaxed atmosphere of an outdoor evening lecture and held a workshop, “Machines in the Landscape of Plemenitaš”, with one of our Psychophysics Machines: the Megatautophone.

As a first step we sent the students out to record sounds that express how they experience Plemenitaš, capturing the soundscape of the scenery, from cicadas, birds and sheep bells to leaves rustling in the wind. They could then listen to their recordings through the directional speaker and learn how to adapt the sound to its special characteristics, applying high-pass and low-pass filters to move the material into the speaker’s usable frequency range. To capture the results of the workshop, and as a prototype for a series of “Machines in the Landscape”, we placed the Megatautophone in the landscape, giving it the transformed sounds of the landscape as a language to communicate with its surroundings. We shot videos from different positions and used Zoom X/Y microphones next to the cameras to record this ‘dialog’ between the machine and the landscape. The videos and sounds still need to be combined and cut. We will present the result soon!
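The high-pass/low-pass step can be sketched with a pair of simple one-pole filters. This is only an illustrative sketch, not the tool we actually used, and the cutoff frequencies are assumptions; in the workshop the filtering was tuned by ear for the directional speaker:

```python
import math

def lowpass(samples, sample_rate, cutoff_hz):
    """First-order IIR low-pass: y[n] = y[n-1] + a * (x[n] - y[n-1])."""
    a = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    y, out = 0.0, []
    for x in samples:
        y += a * (x - y)
        out.append(y)
    return out

def highpass(samples, sample_rate, cutoff_hz):
    """High-pass = input minus its low-passed copy."""
    low = lowpass(samples, sample_rate, cutoff_hz)
    return [x - l for x, l in zip(samples, low)]

def band_limit(samples, sample_rate, low_hz=300.0, high_hz=8000.0):
    """Keep roughly 300 Hz - 8 kHz (assumed values) for the directional speaker."""
    return lowpass(highpass(samples, sample_rate, low_hz), sample_rate, high_hz)
```

Run over a field recording, this attenuates low rumble (wind on the microphone, handling noise) that a directional speaker cannot reproduce anyway, while bird song and cicadas pass through almost untouched.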

Those first weeks of the residency with Kontejner have been very busy: meeting experts and new friends,
setting up our lab that we took with us from home to get the tool chain up and running again for the next generation of prototypes,
while prototype Kryten serves as the guinea pig for finalizing the PCB design and programming the steering behaviors.

This project has been assisted by the Australian Government through the Australia Council, its arts funding and advisory body.

The work is coproduced at Kontejner within the framework of EMAP / EMARE and co-funded by Creative Europe.

We just made the next step in the development of our Empathy Swarm project: We set up a working tool chain for the production of the next robot prototype!

While our previous prototypes Robert and Robertina had a lot of cables, we have now dramatically reduced the number of cables by designing and producing custom-made PCBs (Printed Circuit Boards). This not only increases the robot’s robustness, it also reduces the production time and the complexity of assembling each individual robot – and imagine producing 100!

The circuit is designed on the computer as a black-and-white image in which all the connections are black and the gaps between them are white. This image is printed with a laser printer onto a transparency film. The transparency is used together with a pre-sensitized positive photoresist PCB, which initially looks like a plain copper plate. To transfer the circuit design onto the photoresist PCB, the transparency is fixed to it and both are put into the UV exposure box, so that all the empty ‘white’ areas are exposed while the black toner blocks the UV light.

In our special case we designed a double-sided PCB, which can have circuits on both sides, connected through pins. Both sides have to be perfectly aligned during exposure, which can be achieved by taping the two transparency films into a kind of bag with transparent tape.

After exposure, the pre-sensitized PCB is put into the etching tank until all the exposed parts are etched away, while the shiny copper of the designed traces remains. When you can see light shining through from behind, you know the board has etched long enough. We chose a rather thin board, which we cut into shape with a pair of scissors after the etching process. The remaining photoresist layer is then wiped off with acetone to allow better solder connections. Finally, the holes connecting both sides of the board are drilled, and the PCB is ready for the electronic parts.

In order to reduce the size and weight of the robot as much as possible, we decided to use SMD (Surface Mount Device) parts. These are so tiny that it is difficult to solder them onto the board manually. For that reason we built a temperature-controlled SMD reflow oven, which melts the solder paste and fixes the parts onto the board for you instead. 🙂
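The core of such an oven is a controller that follows a reflow temperature profile over time. As a rough sketch (the temperatures and timings below are generic illustrative values, not our oven’s actual profile or firmware), a simple on/off controller could look like this:

```python
# Illustrative reflow profile: preheat, soak, reflow peak, cool-down.
# (time in seconds, target temperature in degrees C) - assumed values.
PROFILE = [
    (0, 25), (90, 150), (180, 180), (210, 245), (240, 245), (300, 100),
]

def target_temp(t):
    """Linearly interpolate the profile's target temperature at time t."""
    for (t0, c0), (t1, c1) in zip(PROFILE, PROFILE[1:]):
        if t0 <= t <= t1:
            return c0 + (c1 - c0) * (t - t0) / (t1 - t0)
    return PROFILE[-1][1]

def heater_on(t, measured_temp, hysteresis=2.0):
    """Bang-bang decision: switch the heater on when below target minus hysteresis."""
    return measured_temp < target_temp(t) - hysteresis
```

In a loop, the firmware would read a thermocouple every few hundred milliseconds, call `heater_on`, and switch a relay accordingly; a PID controller would track the profile more smoothly, but bang-bang with hysteresis is often good enough for a DIY oven.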

We are getting there. Setting up for our Saturn3 performance tomorrow, in collaboration with Łukasz Szałankiewicz aka Zenial, at Izlog Festival. The robots and the first of the three projectors are set up; only the top-down projector and the one for the face projection are still missing.

Preparing for Izlog Festival and our Saturn3 performance on 2 May, 10:00 pm, we are experimenting with face tracking and real-time projections onto our faces.

Last night we made some first tests: using infra-red light and a visible-light filter on the camera, we tracked the face and its landmarks in real time while the projection was cast back onto the face, visible to the human eye but not to the camera.

It gives a first idea of what it will be like when the robots take over and transform Łukasz Szałankiewicz aka Zenial into one of them during the Saturn3 performance, synthesizing sound depending on his emotions.

Preparing for Izlog Festival, we just set up our working space in the gallery: electronics are spread out, visuals are projected onto the walls, the 3D printer is printing. There is a lot to do, but a coffee in the sun with everybody makes for a fantastic break from time to time. 🙂

Currently we are working on making our Robot Swarm perceive humans. The vision sense of our robot is a camera; the robot sees its environment, but that alone is not enough. The robot needs to learn to differentiate a bit more.

Previously we used Python and could already detect emotions, but to be more flexible and gain more control, we are porting our coding environment to C++.

Since the goal is to respond to human emotions, the first step is to detect faces. In order to teach Robertina to identify a human face, we use the computer vision and machine learning libraries OpenCV and dlib.

And look at what she can already perceive: that’s Adam and me, and she can identify two heads and, amazingly, also our so-called facial landmarks, which will play a relevant role in her future ability to actually perceive emotions!

The workshop gives an introduction to robotics, directional sound and visual control, ranging from working with mechatronic systems to programming in Processing and VVVV and cross-communication using Open Sound Control (OSC).

We give an insight into our artistic practice, and participants have the opportunity to engage with five robots and use the tools we normally use in a joint jam session. This includes controlling the movements of the robots, the sound that is played and a custom-made visual mixer.
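Open Sound Control itself is a simple binary format, usually sent over UDP, which is why Processing, VVVV and our robot tools can all talk to each other. As a hedged illustration built from the OSC 1.0 specification (the address pattern `/robot/1/move` and its arguments are made up for this example, not the actual ones from our setup), a minimal stdlib-only encoder looks like this:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message with float32 ('f') arguments, big-endian."""
    msg = osc_pad(address.encode())                    # address pattern
    msg += osc_pad(("," + "f" * len(args)).encode())   # type tag string, e.g. ",ff"
    for value in args:
        msg += struct.pack(">f", value)                # 32-bit big-endian floats
    return msg

# Hypothetical message telling robot 1 to move: speed 0.5, turn -0.2.
packet = osc_message("/robot/1/move", 0.5, -0.2)
# The packet is sent as a single UDP datagram to the receiving application.
```

In a real patch you would not encode this by hand; libraries such as python-osc, oscP5 (Processing) and VVVV’s OSC nodes do it for you, but seeing the wire format makes the cross-communication less magical.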

When?

26. April

Duration?

~4 hours, starting at 2 pm, open end in the evening

Format?

informal meet-up at the gallery space,

hands-on jam session: control robots + sound + visuals

Where?

Student Center, gallery


Fritz Heider and Marianne Simmel’s analog overhead-projector animations demonstrated as early as 1944 how easily the observer’s brain attaches emotions to inanimate objects, and how the human brain is hard-wired for compassion.

Empathy Swarm will transplant this from a digital to a physical environment by creating a swarm of robots that, as a whole, can adapt its behavior and interact with the observer’s emotional feedback.