Tuesday, 6 November 2012

This post is an updated summary of what @twitr_janus is and what it demonstrates.

This is mainly aimed at attendees of the Museums Computer Network conference, Seattle, 2012, especially the Slack Day hacking workshop, where this cutie will be demonstrated live (if I can put it back together again).

Firstly, here is @twitr_janus packed in cabin baggage, ready to travel as hand luggage. I am not sure it will be allowed on board the aircraft...

What is @twitr_janus?

@twitr_janus is a live physical avatar - a puppet that can be controlled remotely over the Internet using the data sources of common web services like Google Drive and Twitter and direct web-enabled services such as Skype. It is a (very) crude alpha prototype that demonstrates several simple ways to remotely interact physically using web data services or piggybacking the connectivity of audio/visual web services.

Its real point though, is to demonstrate how real-time sensory interaction can be achieved by hacking together feeds, APIs or just features of common freebie web services. It can see you, hear you, talk to you and move its eyes to watch you. That's not creepy at all.

How does this work?

To control @twitr_janus, data is sent to it over the Internet using free web tools that have interactive elements. These include sending a tweet to a specific Twitter account and submitting custom forms that piggyback the form-processing script of a Google spreadsheet, but many other web services could be used.

At the other end of the web, the structured data is retrieved so it can be manipulated. This is done using the open source Processing language running on a connected computer. Processing can parse (selectively extract data from) open RSS feeds and/or APIs.
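The parsing step can be sketched in a few lines. This is an illustrative Python version (the original uses Processing), with a made-up sample feed standing in for the real Twitter/Google data sources:

```python
# Minimal sketch of the feed-parsing step: pull the text out of each
# <item> in an RSS feed. The sample XML below is a hypothetical stand-in
# for the real feed @twitr_janus polls.
import xml.etree.ElementTree as ET

def parse_rss_titles(rss_xml):
    """Extract the title text of every <item> in an RSS document."""
    root = ET.fromstring(rss_xml)
    return [item.findtext("title") for item in root.iter("item")]

sample = """<rss version="2.0"><channel>
  <item><title>Look left</title></item>
  <item><title>Say hello</title></item>
</channel></rss>"""

print(parse_rss_titles(sample))  # ['Look left', 'Say hello']
```

In the real build the same idea runs in a Processing loop that polls the feed URL and hands each new value on to the speech or servo code.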

The data is then converted into physical action. An open source text-to-speech library for Processing is used to generate speech from text data. It can speak either tweets from its Twitter account or the contents of a field in a Google spreadsheet.

@twitr_janus uses the open source hardware platform Arduino to convert other data into physical motion. Eye control is achieved by an Arduino sketch loaded on the board, which reacts to data parsed from a Google spreadsheet feed. This signal data controls the position of two servos that direct the focus of its stare.
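The data-to-servo conversion boils down to a range mapping. Here is a hedged Python sketch of that logic; the 0-100 input range and the 1000-2000 microsecond pulse widths are typical hobby-servo assumptions, not values from the actual build:

```python
# Sketch of the eye-control mapping: a parsed data value (e.g. an eye
# position of 0-100 from a spreadsheet cell) is scaled to a standard
# hobby-servo pulse width. Ranges here are illustrative assumptions.
def servo_pulse_us(position, lo=0, hi=100):
    """Map a data value in [lo, hi] to a 1000-2000 microsecond pulse."""
    position = max(lo, min(hi, position))   # clamp bad feed data
    fraction = (position - lo) / (hi - lo)  # 0.0 .. 1.0
    return int(1000 + fraction * 1000)

print(servo_pulse_us(0), servo_pulse_us(50), servo_pulse_us(100))
# 1000 1500 2000
```

On the Arduino itself the equivalent call would be something like the Servo library's write/writeMicroseconds, with one mapping per eye servo.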

Its jaw is lip-synched to the speech output by converting the analogue audio signal levels into a control signal for a car-door lock actuator motor. An audio amplifier is used to create a signal large enough to detect, and the Arduino sketch triggers a relay power circuit for the actuator based on signal peaks. It also has LEDs to indicate the last source of its control data.
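The lip-sync logic is essentially a threshold detector on the audio level. A minimal Python sketch of that idea, with an assumed 10-bit analogue reading and an arbitrary threshold value:

```python
# Sketch of the jaw lip-sync: sample the (amplified) audio level and
# fire the relay (open the jaw) whenever it crosses a threshold.
# The threshold and sample values are illustrative assumptions.
THRESHOLD = 512  # roughly mid-scale on a 10-bit analogue input

def jaw_states(levels, threshold=THRESHOLD):
    """Return True (jaw open) for each audio sample above the threshold."""
    return [level > threshold for level in levels]

# Quiet, loud, loud, quiet:
print(jaw_states([100, 700, 650, 80]))  # [False, True, True, False]
```

On the Arduino this runs inside loop(), with each True driving the relay pin high so the door-lock actuator snaps the jaw open.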

Finally, it can see and hear via a webcam. It does this using the web video software Skype, which can be remotely activated with the correct settings.

WTF?

The important point is simply that almost any modern web service will use, or often be built upon, a data publishing model. Because these web services have built-in data publishing features, such as APIs (application programming interfaces) or feeds such as RSS, they turn human input into predictable data output that can be manipulated. None of this requires you to set up any infrastructure.

This principle can be used to hack new things together. It's all very enjoyably Frankenstein. @twitr_janus is just a specimen beast, created to illustrate this with a physical example. It simply demonstrates some possibilities, beyond the obvious web front end stuff, for hacking data from current open source or freely available web services to create physical interaction.

It's crude, often breaks and has cost the lives of two Arduino boards, at least five servos and one vintage glue gun (RIP), but it does show how web data doesn't have to be just on-screen.

Other things that could be quickly prototyped using this method are easy to imagine, such as:

Tweetometer for marketing campaigns. Use Twitter data about your campaign tags to make a physical thermometer go up and down to indicate success (e.g. trend ranking, number of retweets, etc.).

Animatronic avatar interaction to query collections or events databases. If data can be extracted, it can be spoken.

Physical puppet that automatically announces new events as they appear in the API of your events database.

etc. etc.
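The Tweetometer idea above is the same mapping trick again. A hedged Python sketch, where the 500-retweet full-scale value and the 180-degree dial are arbitrary assumptions for illustration:

```python
# Sketch of the "Tweetometer": scale a retweet count onto a 0-180
# degree servo dial. FULL_SCALE is an arbitrary assumed ceiling.
FULL_SCALE = 500

def thermometer_angle(retweets, full_scale=FULL_SCALE):
    """Map a retweet count to a dial angle, capped at full scale."""
    return min(retweets, full_scale) * 180 // full_scale

print(thermometer_angle(0), thermometer_angle(250), thermometer_angle(1000))
# 0 90 180
```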

Free code and stuff...

(with apologies to my professional developer colleagues for the state of this hackery!)
The basic (and very rough) code is available for forking/hacking/recycling as you see fit. Note that this will only ever be the last working version:

There are detailed posts about the specific physical build of @twitr_janus in the archive of this blog. Physical build notes crop up in back posts here on makingweirdstuff (interspersed with other nonsense).
E.g. Posts about building eyeballs
