Hey all! It’s Rachel again. I have another amazing Art Showcase for you. This time Neil Mendoza explains how he and Anthony Goh brought these animated bird sculptures to life with the help of a Raspberry Pi, some Arduinos and lots of old mobile phone parts.

I really love this one XD – read right to the bottom if you want to see the birds in action. Over to Neil…

Mobile phones are ubiquitous in today’s society, but often their use has unintended consequences, intruding into and changing social situations, distancing people in real life by dragging them into the digital world. They are also a massive source of electronic waste. A few years ago this inspired Anthony Goh and me (Neil Mendoza) to create an installation that takes cast-off devices and suggests an alternate reality in which these unwanted phones and noises become something beautiful, giving them a new life by creating an experience that people can share together in person. The Barbican recently commissioned us to create a new flock of birds for their awesome Digital Revolution exhibition. Here’s a little tech breakdown of how they work.

In previous versions, the birds were independent, but this time we decided to have a Raspberry Pi at the heart of the installation controlling them all. This gave us the most flexibility to animate them independently or choreograph them together.

The exhibition is travelling, so we wanted the installation to be as easy as possible to set up, and we decided to make each bird talk to the Raspberry Pi over ethernet. This means that communication is reliable over long distances and each bird is self-contained, needing only a power and a data cable connected to it.

The next challenge to overcome was to figure out how to call a bird. In previous incarnations, each bird included a functioning mobile phone that you could call. However, as there is no reception in the gallery, we decided to include a different era of phone junk and make people call the birds with a rotary phone from the 1940s. The system looks something like this…

To make the phone feel phoney, the receiver is connected to a serial mp3 player, controlled by an Arduino that plays the appropriate audio depending on the state of the installation, e.g. dialling tone, bird song etc. The Arduino also reads numbers from the rotary dial and, if one of the birds’ numbers is dialled, sends it over ethernet to the Raspberry Pi.

The iBirdBrain app running on the Raspberry Pi is written in openFrameworks. When iBirdBrain receives a number from the phone, it wakes the appropriate bird up and tells it to move randomly. It then picks an animation created using James George’s ofxTimeline and plays it with some added randomness. The current state of each part of the bird is sent every frame over ethernet as a three byte message:

Byte 1: Type, e.g. ‘s’ for servo

Byte 2: Data 1, e.g. servo index

Byte 3: Data 2, e.g. servo angle
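A fixed three-byte message like this is trivial to pack and unpack on both ends. The struct and field names below are my own for illustration; the post only specifies the byte layout itself:

```cpp
#include <cstdint>

// One control message in the three-byte format: type, data 1, data 2.
struct BirdMessage {
    uint8_t type;   // e.g. 's' for servo
    uint8_t data1;  // e.g. servo index
    uint8_t data2;  // e.g. servo angle
};

// Serialise a message into the three bytes sent over the wire.
void pack(const BirdMessage& m, uint8_t out[3]) {
    out[0] = m.type;
    out[1] = m.data1;
    out[2] = m.data2;
}

// Rebuild a message from three received bytes.
BirdMessage unpack(const uint8_t in[3]) {
    return BirdMessage{in[0], in[1], in[2]};
}
```

Keeping every message the same small, fixed size also means the Arduino never has to buffer or frame a variable-length stream, which matters on a chip with so little memory.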

So that the status of the app could be seen quickly without needing to SSH into the Pi, we decided to use a PiTFT screen. To begin with we rendered the OpenGL output of the app to the PiTFT screen; however, as the screen runs at 20 FPS, this created an unnecessary bottleneck. In the end, we decided to set the screen up so that it would render the console output from the openFrameworks app. After that, the app ran at a solid 60 FPS. Outputting a '\r' character to the console goes back to the beginning of the line, so I used this to create a constantly updating console output that didn’t scroll, e.g.:

cout << '\r' << statusMessage;
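In full, the trick looks something like this – the status formatter is a hypothetical stand-in for whatever the app actually reports, and the `std::flush` makes sure the line appears even though it never ends in a newline:

```cpp
#include <iostream>
#include <sstream>
#include <string>

// Hypothetical status formatter: one short line summarising the
// app's current state.
std::string statusLine(int frame, float fps) {
    std::ostringstream ss;
    ss << "frame " << frame << " @ " << fps << " fps";
    return ss.str();
}

// Overwrite the current console line: '\r' returns the cursor to the
// start of the line, and flushing forces the unterminated line out.
void printStatus(int frame, float fps) {
    std::cout << '\r' << statusLine(frame, fps) << std::flush;
}
```

Calling `printStatus()` every frame gives a single line that updates in place instead of scrolling the PiTFT.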

The birds themselves each contain an Arduino. They speak ethernet using an ENC28J60 ethernet module and this library. To start with I used TCP, but running a TCP stack along with all the other stuff we were asking the bird to do proved a little too much for its little brain, so we moved to UDP as it requires less memory and fewer processor cycles. An ID for each bird was programmed into the EEPROM of the Arduino. That way, there only needed to be one firmware for all the birds; the birds themselves would then set all of their data – IP address, peripherals etc. – based on their ID.
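The single-firmware idea boils down to deriving everything network-related from one stored byte. The subnet and address offset below are invented for illustration – the post doesn’t give the real addressing scheme – but the pattern is the same: read the ID once at boot and compute the rest:

```cpp
#include <cstdint>
#include <string>

// Derive a bird's IP address from its EEPROM-stored ID, so every
// bird can run identical firmware. The 192.168.1.x subnet and the
// offset of 10 are made-up example values.
std::string ipForBird(uint8_t birdId) {
    return "192.168.1." + std::to_string(10 + birdId);
}
```

On the Arduino itself the ID would come from `EEPROM.read(0)` (or similar) at startup, and the same ID could select which servos, sounders and screen the bird configures.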

Each bird has multiple parts that are controlled by the Arduino: servos for the wings and heads, piezo sounders, NeoPixels and a screen for the face.

Escape III is on display at Digital Revolution until 14th September at the Barbican in London – I’m so excited, I’m going next week!