Third iteration of the movie mashup project. This one searches for specific words in a movie — again based on the subtitles file.
I did not count the words in Processing; instead, I copied and pasted the subtitles from “The Wolf of Wall Street” into Textalyser. The top words in the ranking, by far, are “you” and “what” — not very meaningful, though.

So I used “fucking,” the third-ranked word, with 211 occurrences. Looking at the full list, “fuck” and “fucking” both appear many times. Because they are variations of the same word, I searched for every time either of them is said: 431 times in total. That happens in 351 subtitles, so sometimes more than once in a single line.
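The counting itself is simple to script. Here is a minimal Python sketch of the idea (the project itself uses Processing, and the function name and regex here are my own, for illustration): a regex with an optional suffix catches both variants, returning the total number of occurrences and how many subtitle lines contain at least one.

```python
import re

def count_variants(subtitle_lines, pattern=r"\bfuck(?:ing)?\b"):
    """Count total matches and how many subtitle lines contain at least one."""
    rx = re.compile(pattern, re.IGNORECASE)
    total = sum(len(rx.findall(line)) for line in subtitle_lines)
    lines_with = sum(1 for line in subtitle_lines if rx.search(line))
    return total, lines_with

lines = ["What the fuck?", "Fucking great, fucking great.", "Sell me this pen."]
print(count_variants(lines))  # -> (3, 2): three matches, spread over two lines
```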

Another iteration of the movie mashup project. This one searches for repeated lines in a movie. Also based on the subtitles file.
“Groundhog Day” was an obvious choice because of its repeating plot. It is interesting to notice that some of the repeated scenes were certainly shot in a single take, because Bill Murray’s hair looks exactly the same.
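Finding the repeats is a small grouping job. A minimal Python sketch of the idea (the actual project uses Processing, and the names here are illustrative): normalize each subtitle line and keep the ones that occur more than once.

```python
from collections import Counter

def repeated_lines(subtitle_lines):
    """Return (line, count) pairs for lines said more than once, most frequent first."""
    counts = Counter(line.strip().lower() for line in subtitle_lines)
    return [(line, n) for line, n in counts.most_common() if n > 1]

lines = ["Okay, campers, rise and shine!",
         "Watch out for that first step.",
         "Okay, campers, rise and shine!"]
print(repeated_lines(lines))  # -> [('okay, campers, rise and shine!', 2)]
```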

This is the first attempt towards my proposal for a generative movie mashup. The code is based on the same Processing sketch I used before to make a mashup book.
The subtitles’ time codes (start and end) are used to play the movie, jumping from one position to another.
The lines are sorted alphabetically, and the video editing is automated based on that order.
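As a rough illustration of that pipeline (the real sketch is in Processing; this Python version, its regex, and its names are my own), the subtitles file can be parsed into (start, end, text) cues and then sorted by text to produce the edit list:

```python
import re

# One SRT cue: "start --> end" timecodes, then the text until a blank line.
CUE = re.compile(
    r"(\d{2}:\d{2}:\d{2},\d{3}) --> (\d{2}:\d{2}:\d{2},\d{3})\s*\n(.+?)(?:\n\n|\Z)",
    re.S,
)

def parse_srt(text):
    """Return a list of (start, end, line) tuples from an SRT string."""
    return [(s, e, " ".join(body.split())) for s, e, body in CUE.findall(text)]

def alphabetical_cut_list(text):
    """Sort the cues by their text — the automated edit list for the mashup."""
    return sorted(parse_srt(text), key=lambda cue: cue[2].lower())

sample = ("1\n00:00:01,000 --> 00:00:02,000\nBanana\n\n"
          "2\n00:00:03,000 --> 00:00:04,000\nApple\n")
print(alphabetical_cut_list(sample))  # "Apple" cue first, then "Banana"
```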

Idea
I’ve been using delicious since 2004. It has been my one and only bookmarking tool since then. I love that it is a tag-based system, which makes much more sense than a folder-based one.
So for Dataviz’s API assignment, I tried to do something with that. The idea was to visualize how my own interests may have changed from 2004 to now.

Development
a) My first iteration with the data was just displaying its full content. Because 1,374 links are a lot to display on a single screen, I made it a static PDF poster:

There’s not much to see in it besides the list itself. 2006 seems to have the largest number of links, but that’s probably because I imported a lot of bookmarks from my browser (IE?) when I began using delicious.

b) So I tried to make a force-directed network graph, based on this code I found, by Karsten Schmidt.

The left image is Karsten’s original app. The code didn’t work out for my data, because I had too many nodes: the visualization just keeps moving, driven by the repulsion and attraction forces. If I had more time, I would try to fix that.

c) Because my goal was to see some patterns in my interests, I started to work with the tags instead of the links. These iterations are simple attempts to create a sort of tag cloud.

That starts to show something — art is probably there because it is such a broad term that it applies to almost everything I tag. However, that is not much different from delicious’ own visualization. And it doesn’t show any time component, which was the interesting part for me.

d) I started to code a timeline showing tag usage over time. Each column in the bubble chart below displays the number of times a tag was stored in a given month.

It might not be the best way to display the data, but it is certainly a simple and quick one to make. Once again, the image is too big for the screen, so the final output is another PDF poster.
Zooming in on this image was interesting to me. Some things I found out:
– the giant bubble in the corner is the tag “imported.” It is a default delicious tag for bookmarks imported from a browser.
– Art, illustration and design are the tags that appear most frequently — they’re the red and orange ones at the top. Maybe that’s because they’re broad, but it may also mean they’re things I am still interested in.
– Technology-related tags are mostly in the blue-purple spectrum. They reflect more recent interests of mine — JavaScript, pComp etc.
– Some tags are clearly redundant: data and visualization are two separate tags, though they always appear together. The same goes for physical and computing.
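The aggregation behind each column of the chart is straightforward. A minimal Python sketch of the idea (the actual data comes from the delicious API; the function and its names are illustrative): group each bookmark’s tags by the month of its date.

```python
from collections import defaultdict

def tag_usage_by_month(bookmarks):
    """bookmarks: iterable of (iso_date, tag_list) pairs.
    Returns {tag: {"YYYY-MM": count}} — one chart column per tag per month."""
    usage = defaultdict(lambda: defaultdict(int))
    for date, tags in bookmarks:
        month = date[:7]  # "YYYY-MM-DD" -> "YYYY-MM"
        for tag in tags:
            usage[tag][month] += 1
    return {tag: dict(months) for tag, months in usage.items()}

sample = [("2006-03-14", ["art", "design"]),
          ("2006-03-20", ["art"]),
          ("2011-10-02", ["javascript"])]
print(tag_usage_by_month(sample))  # art appears twice in 2006-03
```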

We sewed foam to the sock to prevent the plastic from collapsing like last time. We also tried to isolate the circuit as much as we could.

Even so, the results were different each time. They varied depending on whether we were using the Arduino Uno or the Fio, and whether it was connected to the computer through USB or WiFly…

…and whether or not we were touching the board. That led us to a problem discussed in the Capacitive Sensor tutorial: the board needs to be grounded. The page also explains a lot of the problems we had, like the laptop acting as a sensor too when connected to the board.
After that, we gave up on the Fio/WiFly and decided to work with the regular Uno for this prototype.
For the software part, we added calibration and “tap detection.” Now we finally have it controlling a video!
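For reference, here is a toy Python version of that calibration and tap-detection logic (our actual code is in Processing/JavaScript; the class, the threshold ratio, and the release-counts-as-tap rule are my own simplifications): the baseline is averaged from idle samples, and a tap is reported when a touch above the threshold is released.

```python
class TouchSensor:
    """Toy calibration + tap detection for a stream of numeric sensor readings."""

    def __init__(self, threshold_ratio=1.5):
        self.baseline = None
        self.ratio = threshold_ratio
        self._touching = False

    def calibrate(self, samples):
        # Average a few idle readings to establish the untouched baseline.
        self.baseline = sum(samples) / len(samples)

    def update(self, value):
        """Feed one reading; return 'tap' when a touch is released, else None."""
        touching = value > self.baseline * self.ratio
        tap = self._touching and not touching
        self._touching = touching
        return "tap" if tap else None

sensor = TouchSensor()
sensor.calibrate([10, 11, 9, 10])               # baseline = 10
events = [sensor.update(v) for v in [10, 30, 32, 10, 11]]
print(events)  # -> [None, None, None, 'tap', None]
```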

Software Development
a) 1st Prototype
After building a relatively stable device to measure conductivity along two axes (see previous post), we started working on the software. Though our purpose was to build a simple remote control, we started by testing a sort of trackpad — big mistake, maybe?
For this prototype, we used Processing and serial communication.
We first tried to assign an absolute position to the ball, based on the finger position on the trackpad. That proved to be impossible, because people’s charge on the pad varied a lot.
So we made the charge give the ball a direction, like a joystick — the offset from the pad’s center is translated into a new direction for the ball.
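A minimal Python sketch of that joystick mapping (the real sketch is in Processing; the center value, dead zone, and function name are assumptions): the reading’s offset from the pad’s center becomes a unit direction vector for the ball, with a small dead zone so noise near the center doesn’t move it.

```python
def pad_direction(x_reading, y_reading, center=(512, 512), dead_zone=20):
    """Map raw two-axis readings to a joystick-style unit direction vector."""
    dx = x_reading - center[0]
    dy = y_reading - center[1]
    # Ignore small offsets so sensor noise near the center doesn't move the ball.
    if abs(dx) < dead_zone:
        dx = 0
    if abs(dy) < dead_zone:
        dy = 0
    mag = (dx * dx + dy * dy) ** 0.5
    if mag == 0:
        return (0.0, 0.0)
    return (dx / mag, dy / mag)

print(pad_direction(512, 512))   # -> (0.0, 0.0): resting on the center
print(pad_direction(1023, 512))  # -> (1.0, 0.0): full push to the right
```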

b) 2nd Prototype
After that, we translated the Processing sketch to JavaScript and changed the functions to control a video in the browser.

Hardware Development
a) 5th (?) Prototype
Meanwhile, we replicated the hardware circuit in a non-rigid device, to make it wearable. We sewed the conductive plastic on felt…

…and then on a sock:

Though it looked great as a super-like thing, the plastic collapsed and lost most of its conductivity:

b) 6th Prototype
A much simpler and more stable solution was achieved when we simply put the plastic on an EVA foam wristband: