Jamie and I are pleased because we were finally able to see our collaborative project through to near completion. Of course there were some glitches (we burned out two light bulbs, so they weren't working when we did the demo), and I need more practice writing code so that I don't feel like I'm flying blind.

Most of the Waveshield code was pulled from examples that Ladyada had posted here: Use Waveshield

Starting from the example that plays a sample once through but lets other buttons interrupt it, we were able to add code from the switch tutorials for the relays.
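The heart of that interrupt behavior is a scan of the button pins on every pass through the loop. Here is a minimal sketch of just that check (the three-button count and the active-low wiring with pullups are assumptions, not our exact setup):

```cpp
// Scan the button readings and report which one (if any) is pressed.
// readings[i] holds the digitalRead() result for button i.
// Returns the index of the first pressed button, or -1 if none.
int checkButtons(const int readings[], int n) {
    for (int i = 0; i < n; i++) {
        if (readings[i] == 0) {  // LOW means pressed with pullups enabled
            return i;
        }
    }
    return -1;
}
```

When the scan returns a button index, the loop can stop whatever sample is playing, start the new one, and switch the matching relay.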

A (mostly) accurate wiring diagram was posted in our project update post a while ago, so I'll skip that here. But here are images of the wiring setups. We had two breadboards: one hooked up to the switches and the analog pins on the Waveshield, the other holding the transistors that drive the relays from the digital pins.

Unlike my final project, my conceptual project is completely about the spectacle. At the U I've been studying theatrical lighting due to a lifelong obsession with concerts and the light shows that go on during the performances (Pink Floyd's Laser Show, etc.). Going to concerts was my first exploration into the community surrounding music, and it always amazed me how the dynamics of relationships changed once the music started at a venue. In my conceptual project I would use human interactions and gestures to trigger different elements that add to the communal space in a positive way.

Essentially my project involves the audience walking into a completely empty room, with each one of them triggering a sound as they enter. There would be 30 PIR motion sensors set around the room: 10 along the baseboard, 2 on each wall, and the rest hanging from the ceiling. By restricting the range of each motion sensor to react only to a small area, I can control the amount of movement necessary to trigger specific noises in certain spaces. This way, when people move in the space they can either create a cacophony of percussive noises or explore the space slowly and build a musical composition out of their movement. Each noise set off by a motion sensor would be made up of different samples.

This project would explore the dynamics of forced interactivity in an audience, as when there's a strong trust built amongst the members they can create beautiful moments. This project would be great to set up in a large room, where the sensors can be spaced out. Each participant becomes a part of the space and can interact with the others and the space itself by testing out what each sensor triggers.

For my final project I wanted to work on expanding a simple concept. I had bought a Flip-Flap earlier this semester at Ax Man, and it had never worked quite properly. Flip-Flaps run off of solar power and use a capacitor to keep the leaves waving even if the light disappears. Its consistency was always very appealing to me, as the clicking of the leaves acts like a constantly beating metronome. So for my last project I wanted to learn how I and others could interact with this consistency.

After researching the Flip-Flap, I figured out that I could disrupt the flow of electricity by putting a relay in the wire connecting the solar panel to the capacitor. I used a PIR motion detector to control the flow of energy, so that when you waved at the Flip-Flap, it would wave back at you. I'm pretty happy with how this turned out, as it reacted just as I hoped it would.

This is the Arduino program that I used to control the Motion Sensor:
/////////////////////////////
//VARS
//the time we give the sensor to calibrate (10-60 secs according to the datasheet)
int calibrationTime = 30;

//the time when the sensor outputs a low impulse
long unsigned int lowIn;

//the amount of milliseconds the sensor has to be low
//before we assume all motion has stopped
long unsigned int pause = 5000;

boolean lockLow = true;  //true until a new motion sequence is reported
boolean takeLowTime;     //true at the start of each LOW phase

int pirPin = 3;          //the digital pin connected to the sensor's output

void setup(){
  Serial.begin(9600);
  pinMode(pirPin, INPUT);
  digitalWrite(pirPin, LOW);
  delay(calibrationTime * 1000); //give the sensor time to calibrate
}

void loop(){
  if(digitalRead(pirPin) == HIGH){
    if(lockLow){
      lockLow = false; //wait for a LOW phase before reporting again
      Serial.print("motion detected at ");
      Serial.print(millis()/1000);
      Serial.println(" sec");
      delay(50);
    }
    takeLowTime = true;
  }
  if(digitalRead(pirPin) == LOW){
    if(takeLowTime){
      lowIn = millis(); //save the time of the transition from HIGH to LOW
      takeLowTime = false; //make sure this is only done at the start of a LOW phase
    }
    //if the sensor is low for more than the given pause,
    //we assume that no more motion is going to happen
    if(!lockLow && millis() - lowIn > pause){
      //makes sure this block of code is only executed again after
      //a new motion sequence has been detected
      lockLow = true;
      Serial.print("motion ended at "); //output
      Serial.print((millis() - pause)/1000);
      Serial.println(" sec");
      delay(200);
    }
  }
}

This wasn't as flashy a project as my other ones turned out to be, but I'm very happy to have explored a softer approach to interactive art by bringing in gesture and interaction with a mechanical object. This is a fascinating relationship for me, as I'm still struggling with the extremely reactive role that objects play in our lives today.

Gina Chase was the artist I chose for my second artist review. It was actually the very same day that I posted my conceptual project proposal for exploring space and place. I was drawn to her work because of my interest in memory. I enjoyed her careful attention to detail in her layering of images. I also enjoyed the incorporation of mirrors into many of the pieces, as if to question one's real self as opposed to one's representational self. That idea of images and memory resonates with me, even all these weeks later, and I even held onto the newsprint story.

As a way to experiment and start prototyping physically responsive spatial elements, I developed a simple "windwall" incorporating a passive infrared (PIR) sensor, an actuated switch, and a number of ordinary house fans controlled by an Arduino microcontroller.

As was evident from my class demo, the placement of the motion sensor relative to the zone of activity caused the relay to be triggered constantly. I programmed the sensor with 30 seconds of calibration time to create a baseline with a relatively high amount of motion, but this was ineffective at creating the response I wanted. Repositioning the sensor, masking the Fresnel lens to create a smaller view cone, or using a PIR with manually adjustable sensitivity would have made the interaction more satisfying.

One very helpful tool was the PowerStripTail, available here for less than $20. It is essentially an independently powered relay that lets the 5V microcontroller switch up to 120V AC. I powered three house fans on an ordinary power strip plugged into the PowerStripTail, which greatly reduced the time and circuitry required for this experiment.
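On the microcontroller side, the only real logic is deciding when the fans should run; the PowerStripTail simply follows a digital pin. A minimal sketch of that decision, with a hold time so the fans don't stutter on and off (the function and the hold behavior are my own illustration, not necessarily the sketch that was run):

```cpp
// Decide whether the fans should be running.
// motionSeen:   latest PIR reading (true = motion right now)
// lastMotionMs: time of the last HIGH reading, in milliseconds
// nowMs:        current time, in milliseconds
// holdMs:       how long to keep the fans on after motion stops
bool fansShouldRun(bool motionSeen, unsigned long lastMotionMs,
                   unsigned long nowMs, unsigned long holdMs) {
    if (motionSeen) {
        return true;                          // motion now: keep blowing
    }
    return (nowMs - lastMotionMs) <= holdMs;  // otherwise coast for holdMs
}
```

The result then just drives digitalWrite() on the pin wired to the PowerStripTail.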

My concept project incorporates the exploration of visual and sonic relationships produced by ecosystemic data mapping. More specifically, I'm interested in how distinct spaces sharing a common boundary (e.g., rooms in a building or buildings within a university) could be melded into a common space, or "composite audiovisual ecosystem."

I would use microphones to track the sonic profiles of multiple distinct environments, preferably public spaces: libraries, hallways, cafes, playgrounds, etc. Using custom software, I would extract frequency and amplitude information from these signals in real time and transform them into a series of data streams. These fluctuating data streams would be structurally coupled to various sound parameters of the audio signals being tracked, as well as to video of the environments. This would form a "net" of data connections among the various spaces. This "net" of data couplings would enable the characteristic sound events of each respective environment to induce change in the audio and video signals of the others, thus informing the overall audiovisual output of the piece. (The audiovisual output would include multiple real-time video projections as well as a multichannel speaker array.) In effect, the composite audiovisual output would represent the interactive intersection of multiple spaces in a single environment.

For this project, I came up with an idea about how to make an LED into a signal that can direct people's physical actions, so that people are directed by the light and become participants. The idea is a running game: the LED blinks in different colors and at different frequencies, with each color representing a certain object and each frequency representing a certain distance. When players see the LED, they first need to translate the light into those signals. The interactivity here is different from the traditional notion of interactivity, where you actually touch or smell something to interact. Instead, people translate a visual effect into a signal, and the running part tests their reactions: whoever responds to the frequency and color changes faster wins the game, because they take action first.
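As a sketch of how the blink frequency could encode distance, the blink period might be bucketed into run distances, something like this (all cutoffs and distances are invented for illustration):

```cpp
// Map a blink period (milliseconds between flashes) to a run
// distance in meters: faster blinking means a longer run.
// The thresholds and distances are placeholder values.
int distanceForPeriod(int periodMs) {
    if (periodMs <= 250)  return 50;  // very fast blink: long sprint
    if (periodMs <= 500)  return 25;
    if (periodMs <= 1000) return 10;
    return 5;                         // slow blink: short dash
}
```

The color channel would work the same way, with a second lookup from color to target object.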

It is nice to become knowledgeable in entirely new areas of thinking. Before this year I knew nothing about Max, Arduinos, programming computer chips, LEDs, electronic breadboards, blogs, Media Mill, or Final Cut Pro. I thank my teachers and especially my fellow students who helped me achieve this goal.
I leave this Mark Twain quote to the motor mouths amongst us: "Better to remain silent and be thought a fool than to speak out and remove all doubt."
-- Mark Twain
-- Take care I truly enjoyed this year with my fellow students
-- Lance

Generally speaking, I believe that anything that can be considered "art" is also inherently "interactive" in one way or another. Thus, my definition of Interactive Art is just as impossible to explain as my definition of Art.

However, for the purposes of this class, I will define

Interactive Art: "a performance or installation created by one or more humans, made for and dependent upon a second party of humans to experience in a personal and engaging manner."