Hackers Dream Up Fresh Ways to Visualize Music

Many objects that play recorded music have screens. Sometimes the screen plays a music video, but mostly it displays helpful, if static, information about what’s playing. The possibilities for depicting music visually are far wider, given the surfeit of music data that is available for, or can be generated from, virtually any recorded song in the world in something close to real time.

So, what else should our televisions, laptops, smartphones, tablets, picture frames, and other connected home devices show on their displays or projectors while we listen to tunes?

Music visualizers used to be the exclusive concern of music nerds who would tweak the settings on a Winamp visualizer plugin to make it look more interesting. That could change as music-playing devices gain ever-faster processors, and as the screens and projectors in our lives multiply and grow more connected to each other and to the internet, with its vast repositories of data about music, artists, and fans. Data exists about the individual sections of songs, the timbre of the instruments, the content of the lyrics, even the emotional implications of songs. And that’s still just scratching the surface of what could be visualized, whether to see what’s going on within the music or simply because it looks cool.

None of this means we no longer need displays for song information or music videos; it’s just that these developments will leave plenty of room for new music visualization inventions, especially as manufacturers and services look for every possible way to differentiate themselves in an increasingly crowded market, and as more of the world’s population streams music.

For a hint of what’s to come if music visualization becomes part of our everyday lives, we might look to the fringes.

On the last day of May, about 60 music hackers gathered at Etsy Labs in Brooklyn to spend ten hours building music visualizations and demonstrating them to each other as part of the Monthly Music Hackathon series, organized by Jonathan Marmor, an engineer at The Echo Nest (the Spotify subsidiary that publishes Evolver.fm), along with a dozen other co-organizers: musicians, scientists, artists, engineers, composers, and tinkerers. (The next event in this series is the New Musical Instruments Hackathon at Spotify in New York on Saturday, July 26.)

Here’s what presenters at the Music Visualization Hackathon demonstrated, in order of appearance:

The third guy listed above is a “tech-metal guitarist” with Behold… The Arctopus, which is probably one reason his team spent the day building an app that creates a custom music video for any song, switching images on every beat. The song comes from your computer, the images come from the internet, and for now, as with some of the other hacks listed below, you can only use it if you can run the code.

People can control this three-dimensional visualization of any song by touch from “any mobile device,” says Russo.

It’s complicated, but basically, Russo (who projected his creation onto Torn Hawk at Brooklyn’s Warsaw venue earlier in May, as pictured at right) feeds a combination of live and found video into the software, which generates 3D visualizations from the live audio as it plays and projects them onto the performer.

Out of everything on this list, Hussey’s “hybrid ink-jet printer and electric guitar” (left) is probably the least likely to impact the average music fan, for the obvious reason that most people are not looking to play an electric guitar that is also an ink-jet printer.

Most music visualizers work by moving pixels to reflect music, but this one moves people. First, the hack feeds any song into The Echo Nest’s audio analyzer API to determine how its loudness, pitch, and timbre change over the course of the song. Then it sends the resulting script to human performers’ phones in real time, so they know how to act it out.

“Catherine and I wanted to do something not screen-based, so we built a generative theatre play,” explained Jansson. “It’s similar to an old fashioned Winamp viz, but instead of mapping audio features to colour, shape and position on the screen, we map Echo Nest features to emotion, pace, and distance from other actors. For example, each actor is assigned a pitch, and when two actors’ pitches are active together, those two actors move towards each other. When the loudness changes radically, the actors change between walking, being still, dancing, etc. The actors get their instructions in real time on their smartphones, and they don’t know the ‘script’ in advance.”
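The mapping Jansson describes can be sketched in a few lines. This is a hypothetical illustration, not the hack’s actual code: the feature names, thresholds, and wording of the stage directions are all assumptions about how analyzer output (loudness per segment, active pitches, each actor’s assigned pitch) might be turned into instructions.

```python
# Hypothetical sketch of the generative-theatre mapping described above.
# Feature names, the 10 dB threshold, and the direction strings are
# illustrative assumptions, not the presenters' real implementation.

def instructions_for(segment, prev_loudness, actor_pitch, active_pitches):
    """Turn one analyzed audio segment into stage directions for one actor."""
    directions = []

    # A radical loudness change switches the actor's mode of movement
    # (walking, stillness, dancing, etc.).
    if abs(segment["loudness"] - prev_loudness) > 10:  # dB jump (assumed cutoff)
        directions.append("change movement: walk / stand still / dance")

    # When this actor's assigned pitch sounds together with another
    # active pitch, the two actors move toward each other.
    if actor_pitch in active_pitches and len(active_pitches) > 1:
        directions.append("move toward the other active actor")

    return directions or ["hold current action"]

# A loud segment with two active pitches triggers both rules:
print(instructions_for({"loudness": -5.0}, prev_loudness=-20.0,
                       actor_pitch=4, active_pitches={4, 9}))
```

Pushing these strings to performers’ phones in real time is the delivery layer the quote mentions; the interesting part is that the “rendering target” is human behavior rather than pixels.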

Stensland’s hack (right) conceives musical elements as genes, processing them through a “gene regulatory network” that amplifies or attenuates the influence of each musical gene to create a composition.

The most theory-laden hack on this list, Chord Diagram presents concepts like the Circle of Fifths, tone row matrices, and the piano roll as a music visualization. A circle represents the harmonic progression of the song, with pathways connecting repeated chord sequences.
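The Circle of Fifths layout underlying a diagram like this is easy to compute: the twelve pitch classes are placed around a circle so that each clockwise step is a perfect fifth (7 semitones). The sketch below is a minimal illustration of that layout math, not Chord Diagram’s actual rendering code.

```python
# Minimal sketch of a Circle of Fifths layout. Only the geometry is
# shown; drawing chord-progression pathways between points is omitted.
import math

PITCHES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def circle_of_fifths_position(pitch_class):
    """Return (x, y) on the unit circle for a pitch class 0-11 (C = 0)."""
    # Multiplying by 7 mod 12 converts semitone order into fifths order,
    # because 7 is its own inverse mod 12 (7 * 7 = 49 = 1 mod 12).
    step = (pitch_class * 7) % 12
    angle = 2 * math.pi * step / 12
    return (math.cos(angle), math.sin(angle))

# Sorting by the same key recovers the familiar clockwise ordering:
fifths_order = sorted(range(12), key=lambda p: (p * 7) % 12)
print([PITCHES[p] for p in fifths_order])
# → ['C', 'G', 'D', 'A', 'E', 'B', 'F#', 'C#', 'G#', 'D#', 'A#', 'F']
```

With each chord root mapped to a point this way, repeated chord sequences become repeated paths around the circle, which is what gives the visualization its structure.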

Add a song to this app, assuming it becomes publicly available, and you’ll see it morphed into a circle and broken down by section. Larger sections appear at the center of the circle, while finer-grained section divisions appear toward the edge. Meanwhile, playback sweeps through the clock-like interface in real time, activating each section.
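That clock-like layout boils down to two mappings: a section’s level of granularity picks its ring radius (coarse near the center), and its time span picks its arc around the circle. The sketch below illustrates the idea with made-up section data; it is an assumption about the layout, not the app’s own code.

```python
# Illustrative layout math for the clock-like song-structure view.
# Section data and radii are invented for the example.

def ring_radius(level, n_levels, r_min=0.2, r_max=1.0):
    """Coarser section levels (level 0) sit nearer the center."""
    return r_min + (r_max - r_min) * level / max(n_levels - 1, 1)

def arc_for(section, song_duration):
    """Map a section's time span to start/end angles in degrees."""
    start = 360.0 * section["start"] / song_duration
    end = 360.0 * (section["start"] + section["duration"]) / song_duration
    return start, end

song_duration = 240.0  # seconds (hypothetical 4-minute song)
sections = [
    {"level": 0, "start": 0.0, "duration": 120.0},  # coarse: first half
    {"level": 1, "start": 0.0, "duration": 30.0},   # fine: intro
]
for s in sections:
    print(ring_radius(s["level"], n_levels=2), arc_for(s, song_duration))
```

Animating playback is then just sweeping a radial cursor whose angle is `360 * elapsed / song_duration` and highlighting whichever arcs it crosses.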

Pictured at the top of this post is Kandinskify, which can potentially render any song as a moving painting in the style of the immortal Wassily Kandinsky, who, according to the group’s presentation, was interested in encoding music as symbols in his paintings.

The idea behind this hack was to see how Kandinsky might have interpreted today’s music. For now, the hack is hardcoded to work with one song: “Zo0o0o0p!!! feat. Oddisee,” by Kidkanevil, with the song’s amplitude, frequency, and other elements visualized in a fascinating, artistic way.

Electronic Wind Instruments resemble brass or woodwind instruments you’d typically see in a classical orchestra or jazz band, except that they sense the wind coming out of the performer’s mouth and use that information to make all kinds of sounds, in combination with a button interface for the fingers. EWI Visualization is a work in progress; it makes shapes like a dancing pyramid thing that Marmor describes as “so fun.”

“Some neat things have been done with oscilloscopes in X/Y mode,” writes Adamson, “and I thought it would be fun to do something similar and plot Lissajous figures on a terminal, controllable via OSC.”
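The X/Y-mode idea Adamson describes is simple to reproduce: drive the horizontal axis with one sine wave and the vertical axis with another, and plot the resulting points onto a character grid. The sketch below omits the OSC control layer entirely, and the frequencies, phase, and grid size are arbitrary choices, not Adamson’s.

```python
# Rough sketch of a terminal Lissajous plotter in the spirit of the hack
# above. No OSC here; parameters are hardcoded for illustration.
import math

def lissajous_grid(freq_x=3, freq_y=2, phase=math.pi / 2,
                   width=40, height=20, samples=800):
    """Render one Lissajous figure as a width x height character grid."""
    grid = [[" "] * width for _ in range(height)]
    for i in range(samples):
        t = 2 * math.pi * i / samples
        x = math.sin(freq_x * t + phase)  # X/Y mode: one signal per axis
        y = math.sin(freq_y * t)
        col = int((x + 1) / 2 * (width - 1))   # map [-1, 1] to columns
        row = int((y + 1) / 2 * (height - 1))  # map [-1, 1] to rows
        grid[row][col] = "*"
    return "\n".join("".join(row) for row in grid)

print(lissajous_grid())
```

In the real hack, OSC messages would presumably update `freq_x`, `freq_y`, and `phase` while the terminal redraws, which is what makes the figure “controllable.”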