Author: Alberto

I am fascinated by videos of happy children playing the Pie Face game. I never had one, so I decided to build myself an adult-sized, low-budget version based on an ESP8266 board.

TODO: VIDEO OF MATCH

Rules of the game

Pie Face is a two-player game. Each player pushes a button repeatedly; the challenge is to push the button faster than the opponent. When one player wins, the loser receives a pie in the face.

From a mechanical perspective, the Pie Face game is basically an orientable catapult; the trigger is the difference between the two players' numbers of button presses.

Parts of the system

2 players

2 buttons

2 RC servos:

1 RC servo for direction

1 RC servo for the unlock mechanism

1 microcontroller

Springs

Lego Technic beams

Spoon

Cream

Battery

Microcontroller

I chose a NodeMCU board (an ESP8266-powered board) because it is cheap, Arduino compatible, and I already had one at home. In this project I don't use WiFi, but with a WiFi-ready board any upgrade in the WiFi direction (e.g. a tablet or website interface) is feasible.

Wiring

The wiring of the electronic part is minimal. The two buttons are connected to D4 and D5 (pull-up mode), and the command pins of the two RC servos are connected to D0 and D1.

Software / ESP8266 code

The program on the ESP8266 counts the delta between the two players' button presses. This delta drives the direction RC servo (which orients the spoon toward the face of one player or the other). When the delta reaches a limit, the second RC servo releases the spoon, pouring the cream on the loser.
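The core logic can be sketched as two small functions, kept separate from the Arduino I/O so they are easy to reason about. Everything here is a minimal sketch under my own assumptions — the names, the trigger threshold, and the servo angles are made up, not taken from the original sketch; the real program would read the counters from the buttons on D4/D5 and write the angles to the servos on D0/D1.

```cpp
#include <algorithm>
#include <cstdlib>

// Hypothetical constants (my assumptions, not the original code).
const int TRIGGER_LIMIT = 20;  // presses of advantage needed to fire
const int CENTER_ANGLE  = 90;  // direction servo centered between the players
const int MAX_SWING     = 60;  // max deflection toward one player's face

// Angle for the direction servo: the spoon leans toward the losing player,
// proportionally to how far behind they are.
int directionAngle(int pressesA, int pressesB) {
  int delta = pressesA - pressesB;  // positive: player A is winning
  int swing = std::min(std::abs(delta) * MAX_SWING / TRIGGER_LIMIT, MAX_SWING);
  return delta >= 0 ? CENTER_ANGLE + swing : CENTER_ANGLE - swing;
}

// True when the unlock servo should release the spring-loaded spoon.
bool shouldFire(int pressesA, int pressesB) {
  return std::abs(pressesA - pressesB) >= TRIGGER_LIMIT;
}
```

On the device, the Arduino loop would increment the two counters on button interrupts (or debounced polls), call these functions, and pass the results to the Servo library.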

Yesterday I received a Raspberry Pi Zero W, which for approximately $10 includes WiFi and Bluetooth (bye-bye dongles) but no Ethernet port.

This post contains the instructions for configuring the Raspberry Pi Zero W so that it connects to the WiFi network at the first boot of Linux, without wasting time; after that, you can log in via SSH.
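Before the first boot, a common approach (assuming a recent Raspbian image) is to drop two files onto the SD card's boot partition: a wpa_supplicant.conf with your network credentials, and an empty file named ssh to enable the SSH server. The mount point /Volumes/boot below is what OS X uses; the SSID, password, and country code are placeholders.

```shell
# Assumes the SD card's boot partition is mounted at /Volumes/boot (OS X);
# adjust the path for your system.
cat > /Volumes/boot/wpa_supplicant.conf <<'EOF'
country=IT
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="YourNetworkName"
    psk="YourNetworkPassword"
}
EOF

# An empty file named "ssh" tells Raspbian to enable the SSH server at boot.
touch /Volumes/boot/ssh
```

At first boot, Raspbian picks up wpa_supplicant.conf from the boot partition and joins the network, so the Pi comes up headless and reachable.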

Insert the SD card into your Raspberry Pi Zero W, attach the power USB cable, connect your computer to the same WiFi network, scan the network to find the Raspberry's IP address (I use LanScan on OS X), and connect via SSH (the default user is pi, with password raspberry).

Code

TL;DR

At the beginning I wanted to bypass the intercom entirely, but after a few attempts I realized that the intercom protocol is not well documented: audio, video, and signalling all run on two wires at 26-28 V. It was definitely easier to simulate a press of the intercom button using a relay.

After writing the ESP8266 code with Arduino, I wrote the server part with NodeJS and a webpage.

From an architectural perspective, the initial system is composed of a single server and two clients (the ESP8266 and a smartphone) that communicate over persistent connections (websockets).

Due to the websocket connection limit of free Azure websites (max 5 connections) and other economic considerations, I changed the system to keep a websocket only where it is indispensable, on the ESP8266 side, letting the smartphone use classic HTTP requests to send commands to the server.

I built a webpage with an on/off button to test the system, and it works! After 2 weeks of continuous running without any problem, I am fully satisfied: I can open my door with a touch on my smartphone.

But it would be great if I could open the door without touching the smartphone at all. So I started to investigate the conditions and technologies to take the system to the next level.

As often happens, any simplification for the user adds layers of complexity to the system, and the variables multiply:

location (gps and wifi based)

walk paths

hours of the day

days of the week

available WiFi SSIDs

network to which my mobile phone is connected

The approach based on a web page was no longer sufficient: I switched to a native app.

Recently I built a low-cost, Oculus Rift-like alternative to simulate the original device.

I have a smartphone with an HD screen, and I found an interesting kit on Amazon (2 lenses, 1 strap). So I built a Lego structure to bring these elements together, and wrote a small iOS app that shows two views, one per eye.

The gyro and accelerometer sensors make it possible to track head movement, so the last ingredient was an app that shows multiple 3D views driven by those sensors.

I did not find such an app, so I wrote a small iOS app that shows two Street View frames, rotating the viewpoint based on the sensors. The result is simple and awful 🙂

After some days of building, coding, and testing, I published the first photo of my Lego Ruzzle Solver.

The robot uses 3 NXT motors: one for x-axis movement, one for y-axis movement, and one to move the “finger” up and down. To decrease friction, the two movable parts of the robot are supported on rigid wheels, like a bridge crane.

A curious thing: the capacitive touchscreen of the tablet requires human fingers (attached to a human body) to dispatch touch events. Since attaching fingers to a Lego Mindstorms motor would require detaching them from my body, I discovered that a conductive sponge is a very good alternative; and because vegetable sponge is conductive, I bought one at the supermarket for 1 euro, saving my real fingers.