Meta

Motion

At HackMTL, I learnt an important lesson: make something that’s very obvious and easy to explain. Coalesce was interesting, but so esoteric that I don’t think anyone really followed what it was actually doing. So a photo mosaic seemed like a good choice. Except a photo mosaic isn’t the most compelling thing ever; and so I made a video photo mosaic instead (:

And while I did come in 4th, again (: Leila decided to award me a prize for best use of the Piximilar API. Aww.

I used the iSight camera in my MacBook as a video source for both kinds of mosaic. For the first, simpler kind, you could click anywhere on the video, and the app would show a collection of images matching that colour. If you clicked a second time, it would show a collection of images that were half of each colour, and so on (up to 5 colours – at which point the oldest colour is dropped).
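The click-queue behaviour is easy to sketch: keep the last five clicked colours and give each an equal share of the search. This is a hypothetical reconstruction (the function and variable names are mine, not the app’s), assuming the Piximilar query takes a weighted colour list:

```python
from collections import deque

# Up to five clicked colours are kept; a sixth click drops the oldest.
clicked_colours = deque(maxlen=5)

def on_click(rgb):
    """Record a clicked colour and return the current search palette.

    Each colour gets an equal share of the query: two colours mean
    "half of each", three mean a third of each, and so on.
    """
    clicked_colours.append(rgb)
    share = 1.0 / len(clicked_colours)
    return [(colour, share) for colour in clicked_colours]
```

With one click you get a single full-weight colour; with two, each is weighted 0.5; after the fifth click, the oldest colour silently falls off the end of the deque.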

For the second kind, it showed the video as a constantly updating photo mosaic. Having it show a tile for every individual colour present would have meant making an enormous number of requests to the colour search API; so instead, I made it round colours. At the default rendering level, it rounds each of the RGB channel values to the nearest 64 – enough resolution to sort of see what’s going on, but a small enough colour set that precaching it isn’t too onerous.
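That rounding can be sketched in a couple of lines (my own minimal reconstruction, not the app’s actual code). With a step of 64, each channel collapses to one of {0, 64, 128, 192, 255}, so there are at most 125 distinct colours to precache:

```python
def round_channel(value, step=64):
    """Round one 0-255 channel to the nearest multiple of `step`,
    clamping so values near 255 don't round up out of range."""
    return min(255, round(value / step) * step)

def quantise(rgb, step=64):
    """Quantise an (r, g, b) triple channel-by-channel."""
    return tuple(round_channel(c, step) for c in rgb)
```

Raising the accuracy just means shrinking the step, at the cost of a much larger colour set to fetch.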

But I also made it so you could adjust the colour accuracy! With more accurate colours, the photo mosaic starts out less complete; but as time goes by, its cache of images can expand to include all of the colours being shown. I didn’t quite get to making an asynchronous image loader, so for each frame of video-mosaic rendered, it loads one more colour – that way it never stalls too long on a single frame.
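The one-colour-per-frame trick might look something like this – a hypothetical sketch where `fetch_tile` stands in for the slow, blocking image lookup:

```python
# Cache of colour -> tile image, filled in one entry per frame.
tile_cache = {}

def render_frame(frame_colours, fetch_tile):
    """Render one mosaic frame, fetching at most ONE missing colour
    so a single frame never stalls on many lookups.

    Returns only the tiles already cached; gaps fill in over time.
    """
    missing = [c for c in frame_colours if c not in tile_cache]
    if missing:
        colour = missing[0]
        tile_cache[colour] = fetch_tile(colour)
    return {c: tile_cache[c] for c in frame_colours if c in tile_cache}
```

On the first frame only one tile appears; each subsequent frame adds one more, which is exactly the "mosaic gradually completes itself" effect described above.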

(Uhm. It looks like there’s something weird with Vimeo embedding which is making it show the video for Coalesce twice. If you’re looking at this on my homepage, and this seems confusing, click the link under the video.)

[This was an iSight being fed into a sprite as the main image, and into a variable-length queue for the echoed image, along with noise fed in using a chaos factor of 0.25. Chaos factor? What’s a chaos factor, you might ask? It pretty much determines the level of noise. Too much noise looks a bit something. A little bit of noise looks interesting. A bit something is a totally valid way to describe it.]

When I was in Melbourne, I ended up hanging out with the lovely dpwolf. Some of the stuff he knows about things I don’t, leaked out. And now I know enough to be dangerous. Muahaha.

One of the things that leaked out was the existence of Quartz Composer, which I have been abusing at length, trying to get it to jump through the hoops I want using some experimental MIDI interfaces that the infinitely patient Chris Wright of Kineme cobbled together (for me? Maybe <:). There comes a point where you put the toy which is making the bad words come out down, and pick up different toys instead.

So I’ve been playing with rendering stuff! Without MIDI (: These are two closely related animations. On the left is Jelly; ’cause there’s a little patch where it looks like jellyfish tentacles, sort of, and I’m fond of jellyfish (but I wouldn’t want to give one a hug). On the right is Points. Because it’s pointy <:

I have no idea if they will display properly (or at all..) if you’re not using a Mac. Sorry (: They might also go verrry slooooowly if you’re using much less than a Core 2 Duo processor. On the other hand, if you have all the grunt in the world, clicking on the movies will take you to a 720p version.