Pi-Controlled Billy From The Saw Horror Flicks

[David0429] has made a very scary Raspberry Pi controlled puppet. Scary that is if you’ve seen the Saw movies where a serial killer uses one like it, called Billy, to communicate with his victims. If you haven’t, then it’s a pretty neat remote-controlled puppet-on-a-tricycle hack.

A stepper motor hidden under the front fender moves the trike by rotating the front wheel. It does this using a small 3D printed wheel that’s attached to the motor’s shaft and that presses against the trike’s wheel. Steering is done using a 3D printed gear mounted above the fender and attached to the steering column. That gear is turned by a servo motor through another gear. And another servo motor in the puppet’s head moves its mouth up and down.

[David0429] took great care to make the puppet and tricycle look like the one in the movie. Besides cutting away excess parts of the trike and painting it, he also ran all the wires inside the tubular frame, drilling and grinding out holes where needed. The puppet’s skeleton is made of wood, zip ties and hinges but with the clothes on, it’s pretty convincing. Interestingly, the puppet in the first movie was constructed with less sophistication, having been made out of paper towel rolls and papier-mâché. The only things [David0429] would like to do next time are to quieten the motors for maximum creepiness, and to make it drive faster. However, the need for a drive system that could be hidden under the fender resulted in one that only works at low speed. We’re thinking that driving it through the rear wheels might make it possible to provide both speed and stealth. Ideas anyone?

In any case, as you can see in the video below, the result is suitably creepy.

Great job on the puppet. I have to say it is too slow.
Putting two geared motors on the rear will help. Sorry, but it’s way too slow.
And the talking could be better, with the mouth opening and closing as he talks, not just hanging open.

Like I said, great job on the puppet and bike. Keep up the good work.
And thanks for showing your work.

For lip animation, we’ve had great results animating animatronics from a text-to-speech engine with just a rolling average of the amplitude (sample value) and a fixed falloff.
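For anyone wanting to try this, here’s a minimal sketch of the rolling-average-plus-falloff idea the commenter describes. It assumes raw PCM samples normalized to ±1; the window size and falloff rate are illustrative values you’d tune for your servo:

```python
from collections import deque

def jaw_positions(samples, window=256, falloff=0.02):
    """Map audio samples to a 0..1 jaw-open value.

    A rolling average of |sample| drives the jaw open immediately;
    a fixed per-sample falloff pulls it closed, which smooths out
    flutter between syllables.
    """
    buf = deque(maxlen=window)   # rolling window of absolute amplitudes
    jaw = 0.0
    out = []
    for s in samples:
        buf.append(abs(s))
        level = sum(buf) / len(buf)
        # open instantly to the smoothed level, close at a fixed rate
        jaw = max(level, jaw - falloff)
        out.append(min(jaw, 1.0))
    return out
```

Each output value would then be scaled to your servo’s pulse range; processing the whole file up front (as the commenter mentions later) also lets you compensate for any playback latency.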

Later, in Unity3D, I did something similar. The models had several face morphs for vowels, so we selected a few and did a realtime FFT, or even Goertzel for the specific bins corresponding to the phoneme in question, and then used the same moving-average filter described above to control the morphs. The results were more than good enough, and we could just throw in any audio file and not worry about doing lip/jaw animation.
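The Goertzel trick is handy precisely because you only care about a handful of bins. As a sketch of the standard single-bin Goertzel recurrence (the sample rate, target frequency, and block size here are illustrative, not from the commenter’s project):

```python
import math

def goertzel_power(samples, sample_rate, freq):
    """Return the power in a single frequency bin via the Goertzel
    algorithm -- cheaper than a full FFT when only a few bins
    (e.g. formant bands mapped to vowel morphs) are needed."""
    n = len(samples)
    k = round(n * freq / sample_rate)     # nearest integer bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2  # second-order recurrence
        s_prev2, s_prev = s_prev, s
    # squared magnitude of the bin from the final two states
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2
```

Running this per audio block for each phoneme’s band, then smoothing the powers with the moving-average filter above, gives a cheap realtime driver for the face morphs.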

For your animatronic version, did you do anything special to sync up the amplitude changes with the motor movements? Often when I see lip-synced robots, the mouth movements lag a bit, presumably because the software starts the sound and then moves the motor (been there, done that :)).

If I remember correctly, we processed the audio file before playing it, and there might have been a fixed delay; I can’t remember for sure, but I don’t think so. The movement was done with a simple RC servo.