VJ

Overscan’s VJ past

Overscan were Michael Beverley and Paul Martell-Mead, best described as accidental VJs whom you might have found performing their unique blend of club visuals in London clubs from 1994 to 2000.

It all started when Michael and Paul bumped into Jez, the promoter for Tribal Energy, on a train. Michael had been designing Tribal Energy’s flyers for some time and had attended a couple of their events. Michael had been experimenting with (bad) ambient music and asked Jez if he could perform in the ambient room. Michael also suggested that his friend Paul could bring his Amiga and play back some Amiga demos on a video wall that Jez had previously mentioned. Jez said “Forget the ambient room. Come on the main stage with the band. We’ll get a video projector and you can show the demos whilst the band are playing”. Paul rightly said that there was no way they would project other people’s work, so they had to come up with something else in the six weeks before their first gig.

Their first gig was performed using a single Amiga 4000 with a RocGen genlock and a single VHS video player. It was somewhat limited but still great fun.

Their setup evolved over time. The classic multi-layered setup involved two foreground layers: Amiga 4000s with Lola genlocks. Underneath that, the background layers – a Perception PVR, an MPEG-2 board and an MJPEG board (all three hosted in the same PC!) – were mixed with a Panasonic WJ-AVE55.

They were resident VJs at Tribal Energy (1994-95), Club Alien (1995-97), Restless Natives Trancentral (1997-99) & Fevah (1999-2000) at many venues including Cloud 9, The Theatre Factory, Electrowerks, The Rocket, The Fridge, The Forum, Heaven and The Astoria.

As the uk.music.rave people put it: “Overscan are responsible for the incredible computer-generated projections you’ll see if you go to Trancentral (and maybe Submaniac and Tsunami). The legendary penguins, Fishocalypse Now, interplanetary pineapples, marching pink dragons – and the real thing is weirder than that sounds.” 🙂

Overscan’s VJ future

Michael has been working on a web-based VJing platform. There are two components: a web-based visual sequencer and an output window. Currently this only works in Google Chrome, as the output window uses WebM video with an alpha channel, and both components use MIDI for communication.

At the moment the app/website/thing has 10 distinct layers, though some of those layers live within a WebGL canvas object, and background videos A and B each have two internal layers to crossfade between videos.

Foreground video A (WebM with alpha)

Foreground video B (WebM with alpha)

WebGL – Dancing creatures

WebGL – Audio waveform

WebGL – Audio spectrum

WebGL – Kaleidoscope A

WebGL – Kaleidoscope B

GIF colour cycling

MP4 background video A

MP4 background video B
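The two internal layers of each background video slot could be crossfaded along these lines – a minimal sketch, not the app’s actual code; the function name and the equal-power curve are assumptions:

```javascript
// Sketch: crossfade between the two internal layers of a background
// video slot. t in [0, 1]: 0 shows video A fully, 1 shows video B fully.
// An equal-power curve keeps perceived brightness roughly constant
// through the middle of the fade.
function crossfade(t) {
  const clamped = Math.min(1, Math.max(0, t));
  return {
    opacityA: Math.cos(clamped * Math.PI / 2),
    opacityB: Math.sin(clamped * Math.PI / 2),
  };
}

// In the output window these values would drive the CSS opacity of two
// stacked <video> elements, e.g.:
//   const { opacityA, opacityB } = crossfade(t);
//   videoA.style.opacity = opacityA;
//   videoB.style.opacity = opacityB;
```

Once the fade completes, the now-invisible layer is free to load the next clip, which is the usual reason for keeping two internal layers per slot.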

All of the layers can be manipulated using the sequencer component or a pair of Akai MIDImix controllers.
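In Chrome, wiring a controller like the MIDImix up to layer parameters comes down to parsing control-change messages from the Web MIDI API. The sketch below is illustrative only – the CC-to-layer mapping and the `setLayerOpacity` helper are assumptions, not the platform’s real API:

```javascript
// Decode a raw MIDI control-change message (status 0xB0-0xBF) into a
// channel, controller number and a 0..1 level. Returns null for any
// other message type (note on/off, pitch bend, etc.).
function parseControlChange(data) {
  const [status, controller, value] = data;
  if ((status & 0xF0) !== 0xB0) return null; // not a control change
  return { channel: status & 0x0F, controller, level: value / 127 };
}

// Browser wiring via the Web MIDI API (Chrome):
// navigator.requestMIDIAccess().then(access => {
//   for (const input of access.inputs.values()) {
//     input.onmidimessage = e => {
//       const cc = parseControlChange(e.data);
//       if (cc) setLayerOpacity(cc.controller, cc.level); // hypothetical helper
//     };
//   }
// });
```

The same message format works in both directions, which is presumably how the sequencer and the output window stay in sync: one side sends control changes, the other listens.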