I’ve decided to put up some how-tos for Max/MSP/Jitter, for people wanting to do some “real world” work. I’ve used Jitter for post-production/effects work on a music video (or two now..), and I’ve noticed that a lot of the examples posted to the Max/MSP/Jitter lists are abstract and don’t really walk through the thought process of constructing a well thought out Max patch – thus ‘Real World Max’ is born. I should probably title it ‘Real World Jitter’, because MSP is not my forte…

Regardless – let’s get down to business:

Glitch

It’s hot, it looks sexy and it’s in (or am I late to the party again?). But how do you do Glitch on purpose? Well, first let’s define what Glitch is:

Pure Glitch

The result of a malfunction or error.

There is a great deal of scope in the discussion of what can be classed as a Glitch. Primarily, in a theoretical, scientific and non-art sense, a glitch is assumed to be the unexpected result of a malfunction. The word glitch was first recorded in English in 1962, during the American space programme, namely in the writings of John Glenn where it was used to “describe the problems” they were having. Glenn then gives the technical sense of the word the astronauts had adopted: “Literally, a glitch is a spike or change in voltage in an electrical current.” (John Glenn, cited in American Heritage Dictionary, 4th Ed. (2000))

So in a sense the glitch has always been associated with the definition of a problem. It’s a word used to describe the result of a situation when something has gone wrong. Admittedly, it is also a problematic and contradictory area of study as we shall find out.

According to Motherboard in their advert for the glitch symposium, Norway 2002: “Glitch” is a commonplace expression in computer and networks terminology, meaning to slip, slide, an irregularity, a malfunction or a “little electrical error”.

In my discussions with Glitch artists and digital artists, the glitch I have described was also referred to as a real or pure Glitch. The Pure Glitch is therefore an unpremeditated digital artefact, which may or may not have its own aesthetic merits.

The Glitch-alike

Glitch artists either synthesise glitches in non-digital mediums, or produce and create the environment that is required to invoke a glitch and anticipate one to happen [..]. Because of the intrinsic nature of this imagery and its relation to pure glitches, both in terms of process and viewer perception, I felt the need to form a word that adequately describes this artefact’s similarity with actual glitches and presents it as an obviously separate entity. Thus the term “Glitch-alike” came about to fulfil this role. Therefore, Glitch-alikes are a collection of digital artefacts that resemble visual aspects of real glitches found in their original habitat.

We want to be able to make glitchy-looking material. Essentially, we want the look – not the errors. What does it look like? Take a look at Glitch Browser to get a feel for what images look like pre- and post-glitch. If you are on OS X, I suggest opening up Console.app in /Applications/Utilities and noticing all the Huffman errors on those images. Try opening them up in Photoshop. You will probably get an error and it won’t open. This is because Glitchbrowser made an actual glitch – i.e. it took a perfectly good JPEG, munged the headers, shifted some bits and sent it on its way. The result? Your JPEG decompression engine parsed those flipped/mixed bits and produced a very nice digital distortion of the image. This is an important point.
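To make that concrete, here’s a rough pure-Python sketch of the kind of byte-munging Glitchbrowser presumably performs (the function name and parameters are my own; a real JPEG header isn’t a fixed 100 bytes, so treat `skip` as a crude stand-in): flip a few random bits past the start of the file and let the decoder choke on them.

```python
import random

def munge(data: bytes, skip: int = 100, flips: int = 20, seed: int = 0) -> bytes:
    """Flip a few random bits in a byte stream, leaving the first
    `skip` bytes alone so a decoder still recognizes the file."""
    rng = random.Random(seed)
    out = bytearray(data)
    for _ in range(flips):
        i = rng.randrange(skip, len(out))  # never touch the "header"
        out[i] ^= 1 << rng.randrange(8)    # flip one random bit
    return bytes(out)

# e.g.: glitched = munge(open("photo.jpg", "rb").read())
```

Feed the result to an image decoder and the flipped bits cascade through the Huffman-coded data – exactly the distortion (and the Console.app errors) described above.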

In Pure Glitch, the decoder/viewer/parser engine is responsible for generating the disrupted output. In Jitter, we typically only have access to matrices after they have been decoded. If you load a QuickTime movie into jit.qt.movie, you are given the uncompressed matrices out of the object’s outlet. Jitter and QuickTime have transparently handled the decoding stage for you, so you never see any QuickTime header information or raw bytes, nor do you have the opportunity to glitch the input to jit.qt.movie or jit.qt.grab from within Jitter itself. There are ways around this, but for now we are going to explore controlled Glitch-alike methods using “traditional” Jitter techniques.

As an aside, this is what we don’t want to do… That technique is not real time, it’s limited, and you can’t use it on live footage. We want Jitter to do all of this for us on saved or live footage, in real time, in response to our whims…

Breaking down the look

Ok, so what exactly is going on here visually? We have discoloration (super/de-saturation – hue shifts), we have added noise, and we have displacement/quantization (pixelation). Iman describes common Glitch features as fragmentation, repetition, linearity and complexity.

Let’s identify some objects in Jitter that can get us going and build some tests. Our strategy is divide and conquer: get the little pieces working, understand what the objects are doing, and then put it all together.

First, we’re going to need some video input, so we need a jit.qt.movie or a live jit.qt.grab video input. We have a bunch of options for the rest of our objects, so let’s go through them group by group:

Check out these objects’ help files and get familiar with what they do and what options they have. Let’s start out by adding some noise to a movie. We note that the noise added to Glitched images is blocky, so let’s start with a small jit.noise object and add some digital interference to some video via jit.op:

Patching

This is far too formulaic. It’s boring, and it doesn’t look right. What’s wrong? Well, for one, the noise is too regular – meaning it’s constantly noisy in the same way: square, and in the same places. Glitched noise is in random places and is more rectangular. Let’s see if we can’t get some noise that looks like that. Our strategy is to take two noise generators at different dimensions and combine them so that we get a more non-uniform, rectangular distribution of our noisy signal.
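If you think of a Jitter matrix as a plain 2-D array, the two-generator trick can be sketched in pure Python (names and dims are illustrative, not actual Jitter code): each jit.noise at a small dim is just a coarse random grid stretched over the frame, and jit.op combines the two fields.

```python
import random

def block_noise(w, h, cells_x, cells_y, seed=None):
    # one random value per coarse cell, stretched over the full frame --
    # roughly what upsampling a small jit.noise matrix gives you
    rng = random.Random(seed)
    grid = [[rng.randrange(256) for _ in range(cells_x)] for _ in range(cells_y)]
    return [[grid[y * cells_y // h][x * cells_x // w] for x in range(w)]
            for y in range(h)]

# two generators at different dims, like a wide "dim 20 3" noise
# combined with a smaller, squarer "dim 4 5" noise
wide   = block_noise(320, 240, 20, 3, seed=1)
square = block_noise(320, 240, 4, 5, seed=2)

# combine them (bitwise AND here, standing in for a jit.op blend mode):
# rectangles only survive where both fields are bright, which breaks the
# uniform grid up into a much more irregular distribution
mask = [[a & b for a, b in zip(ra, rb)] for ra, rb in zip(wide, square)]
```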

This is a lot better, but it still feels a bit too ‘uniform’. We’re perfectionists and we want to get it right. Let’s throw in a little something to shift our noise around randomly: jit.tiffany, from our list above.

This works much better, especially at higher-dimension noise input. It ‘feels’ more glitchy, especially when we use large ‘aspect ratio’ noise, like dim 20 3 or dim 15 4, combined with smaller, more square noise like dim 3 2 or dim 4 5, etc. I’ve left in the jit.op ubumenus so you can play with blend modes, but I found bitshift right ‘>>’ to be the best. Use your discretion.

Now that we have our noise working, let’s build our displacement/reposition patch separately and get that working well. Let’s start with jit.rubix and see what we can get. Open up the rubix help file and load in a movie you know. Let’s play with it and see if we can’t get some glitchy-ness. The first issue is that jit.rubix is limited to columns and rows, so it ends up looking much like our original ‘noise’ patch from above. It’s too regularly displaced. We need an object that isn’t limited to columns and rows for displacement. Let’s check out jit.repos. Open up its help file and play around.
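The limitation is easy to see if you sketch what jit.rubix is doing in pure Python (again an illustration of the idea, not Jitter code): it can only ever slide whole rows or columns, so every tear is a full-width band.

```python
def shift_rows(frame, offsets):
    """Slide each whole row horizontally by its own offset (wrapping
    around) -- a rubix-style displacement limited to entire rows."""
    w = len(frame[0])
    return [[row[(x - off) % w] for x in range(w)]
            for row, off in zip(frame, offsets)]

# every displaced region spans the entire row -- too regular for our look
```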

Yes, we have a winner. jit.repos takes in a matrix and applies it as a displacement map to shift pixels. Let’s modify our noise generator (note that jit.repos takes a 2-plane matrix, not a 4-plane char matrix like normal video), hook it up, and see what we get:
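Conceptually, jit.repos does a per-pixel lookup: the 2-plane input matrix supplies an (x, y) source coordinate for every output pixel. Here is a minimal pure-Python model of that idea (my own names; the real object has further modes for wrapping and relative offsets):

```python
def repos(frame, xmap, ymap):
    """Per-pixel lookup: output (x, y) is fetched from the input at
    (xmap[y][x], ymap[y][x]); out-of-range coordinates are clamped."""
    h, w = len(frame), len(frame[0])
    return [[frame[min(max(ymap[y][x], 0), h - 1)]
                  [min(max(xmap[y][x], 0), w - 1)]
             for x in range(w)]
            for y in range(h)]
```

Feed it identity maps and the frame passes through untouched; feed it blocky noise and whole rectangles of the image get torn out and redrawn elsewhere.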

Wow. We could almost call it quits here. Our noise generator and tiffany add-on are generating very glitch-like repositioning maps. Without the pwindows and extra overhead, the patch is small, and it runs fast. It looks great. But we want more… let’s combine our two working patches: discoloration and displacement. Note that we’ve already got our random noise generator from before. Being good coders, we’re going to reuse its noise for both discoloration and displacement. No need to do things twice.

This is basically it. The beauty here is that our displacement map is our discoloration map. Wherever there is displacement in the image, there is discoloration, which is essentially what happens in an actual glitch.
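The one-map-two-jobs idea can be sketched like this in pure Python (illustrative, not Jitter code; the particular discoloration – a bit-shift and XOR on displaced pixels – is just one choice, echoing the ‘>>’ blend mode from earlier): the same noise values decide both where a pixel is fetched from and how its color gets mangled.

```python
def glitch(frame, dx, dy):
    """One noise map, two jobs: dx/dy displace each pixel, and any pixel
    that was displaced also gets its channels discolored."""
    h, w = len(frame), len(frame[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            sx = min(max(x + dx[y][x], 0), w - 1)   # clamped source x
            sy = min(max(y + dy[y][x], 0), h - 1)   # clamped source y
            r, g, b = frame[sy][sx]
            if dx[y][x] or dy[y][x]:
                # displaced pixels get discolored too: shift and flip
                # some bits wherever the map moved something
                r, g, b = (r >> 1) ^ 0x80, g, b >> 1
            row.append((r, g, b))
        out.append(row)
    return out
```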

Final Touches

Being perfectionists, we know there is still room to improve. Let’s add some extra touches to spruce things up and keep our glitch patch from being monotonous. Let’s add some UI objects to give us control over our noise and (dis)coloration, as well as the frequency of our noise generation. This isn’t the end-all be-all of Glitch-alike (obviously), but it’s a good starting point for a patch. We are really only going after one particular style of glitched visuals. Check out the Glitch Aesthetics site and see if you can re-work the patch to be more dynamic in the types of glitch it produces. Also, a few other objects to play with: jit.scanwrap and jit.rubix. jit.rubix has a great ‘probability’ message that will ‘hold’ a portion of the image, much like a DV stream that is erring. Here’s our “final” patch.

thanks for this revelatory glitch page. i found it super useful and helpful as i develop my own style with jitter.
seeing others’ patches helps so much, as i think that’s how i really learned csound, supercollider and pd.
applause to vade!

Thanks! This tutorial is really helpful for learning how to dissect visuals. If you tossed a few more of these screenshot-vs-thought-process tuts out there, I and the other visual n00bs wouldn’t mind.

Also, you mentioned that [jit.repos] needed a 2-plane matrix, however I accidentally patched two 4-plane matrices in, and there weren’t any problems (which also got rid of the need for [jit.coerce]). Perhaps [jit.repos] just ignores the other two planes?

[…] Anton Marini (Vade) kindly referenced me back in 2006. http://abstrakt.vade.info/?p=48 He’s now consistently been doing work which I highly rate. I want to get experimenting with Quartz Composer. […]

Thank you very much, this tutorial was quite helpful and instructive! As you noted, the abstract nature of the example files in the jitter documentation and examples on the forums can be frustrating, so it’s great to see a concrete example that can be quite useful.

Hey there, this is awesome and looks exactly like what I am looking for for my live performance.
I am new to Max/MSP & Jitter and I was wondering if this patch is able to be controlled via midi (with a controller) within Ableton Live with MaxForLive? Any help in understanding how to set this up would be massively appreciated.
Thank you. 😀