A lamp is quite useless: it does not talk, cannot dance, and is boring. This is where I try to help.

Luci is an autonomous lamp with a webcam, a high-power LED in the lampshade, and five servo motors. She is controlled by an Odroid U3. Once switched on, she looks around, checks the environment for any beings, and then does what she wants.

Mechanics were the riskiest part, or so I thought. I bought a desk lamp, measured its dimensions, and tried to give it a more childlike character by making the lampshade too big and the arms too short. TurboCAD helped to model the lampshade and the joints.

Woodwork was next. After having spent hours in the basement cutting the hinges out of multi-layered birch wood, it turned out that the friction between struts and joints was too high. I reconstructed it with ball bearings, went back to the basement, tried again, and noticed that the springs needed different mounting points, changed that, went back into the darkness, and breathed in the fine dust again. Especially after adding the ball bearings I was glad I had used TurboCAD, since this required four slices per hinge with very precise drill holes.

Lampshade. I did not want to repeat that dusty experience with the lampshade, so 3D printing appeared very tempting, and I had wanted to try it out anyway. Below you see the 3D model of the lampshade. Inside, one can see a small platform the servo motor will be mounted upon (marked red). The hole in front of the platform (marked green) gives space to the axis of the servo motor mounted to the upper arm of the lamp.

The following two pictures show the construction inside Luci's head that will store the electronics. The hole at the upper part of the cover disc will give space for the camera, the hemisphere is of transparent plastic with a small LED inside, and the wooden lever is used to turn the head with a servo motor. The rolls at the bottom of the lever will reduce the torque on the head-nodding servo motor by means of a spring pulling up the head. Hard to picture, but it will become clear once the final construction is ready.

I gave it to a 3D printing shop. After paying a surprisingly high fee (120 €!) I had a piece of ABS in my hands. It looked a bit ugly and plastic-like (okay, it actually is plastic), but after painting it with black and white varnish it is at least reminiscent of metal.

Then I went to the basement and started the wooden construction of the lampshade's inner structure as designed. On the left is the first approach, with a staircase-like aperture to prevent light from interfering with the camera image. Still, it did not work out, so I replaced it with a more unobtrusive hole. In the background, the first version of the base can be seen. Again, really ugly, but I was about to improve the design.

Hardware was next. I started with a BeagleBoard, but it soon became clear that it was too slow for facial recognition, so I ordered an Odroid U3. Unfortunately, it does not provide five PWM outputs for the servos, so a separate µC is needed to generate the PWM signals. In addition, the Odroid is sensitive to voltage drops; after connecting the servos to a separate board with its own power supply, sudden resets of the Odroid did not occur anymore. The Odroid with the self-made Arduino shield was supposed to be placed in the base housing of the lamp, but in the end I did not put it there, since I was scared of having a hot Odroid under full load inside a wooden box full of splinters.

Software. The program runs on Ubuntu and is written in C++ with multiple threads to make the most of the four cores. The first thread grabs the image from the webcam; a second thread takes these images and tries to find all faces currently looking at Luci. The result is a list of faces, and Luci focuses on the biggest one.

A third thread runs the trajectory planning algorithm, which produces a sequence of points and orientations in three-dimensional space generated by certain patterns. When no face is detected, Luci runs a search pattern, sampling the environment until a face has been found. Then Luci carries out a pattern simulating real emotions: nodding without knowing why, pretending to listen, coming closer or retreating depending on the movements of the face. It's like real life at home.

Trajectory Planning. The implementation of the trajectory patterns is rather simple: whenever Luci runs short of queued points to be approached, she invokes a pattern point generator which is parameterized with the current pattern. There, the next point of a predefined sequence of movements is generated. In the case of the pattern that interacts with a face, this is:

1. move back quickly (surprise, recognizing a familiar face)
2. move slowly towards the face (first shy contact)
3. watch the face from various angles (closer inspection)
4. move down and watch the face from a frog's perspective (looking cute)
5. go up again and nod (pretending an agreement)
6. …
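Expressed as code, such a pattern point generator might look roughly like this — a minimal sketch with hypothetical names and offsets, not Luci's actual implementation:

```cpp
// Sketch of a pattern point generator: each call returns the next target pose
// of a predefined sequence, relative to the currently tracked face.
// All names, units, and offsets here are made up for illustration.
struct Pose {
    double x, y, z;     // position in mm
    double yaw, pitch;  // head orientation in rad
};

class FacePattern {
public:
    // Next pose of the "greet a face" sequence, relative to the face at facePos.
    Pose next(const Pose& facePos) {
        Pose p = facePos;
        switch (step_++ % 5) {
            case 0: p.z -= 150; break;                 // 1. move back quickly (surprise)
            case 1: p.z += 100; break;                 // 2. approach slowly (shy contact)
            case 2: p.x += 80;  break;                 // 3. watch from the side
            case 3: p.y -= 120; p.pitch += 0.4; break; // 4. frog's perspective
            case 4: p.y += 120; p.pitch -= 0.4; break; // 5. go up again and nod
        }
        return p;
    }
private:
    int step_ = 0;  // position within the sequence
};
```

The generator is stateful on purpose: the trajectory planner only ever asks for "the next point" when its queue runs short.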

Some patterns with special movements are hardcoded, e.g. when Luci pushes a box from the table or looks guilty for watching dirty pictures (1:34 and
2:00 in the video).

Finally, the main loop takes the previous and next point of the trajectory and interpolates all intermediate points at 60 Hz using a cubic Bézier curve to smooth the movement. The support points of the Bézier curve are geometrically derived from the trajectory's prev-prev (A) and next-next (D) points by the rule shown in the picture. Since polynomials of higher degree tend to oscillate when support points are too far away from each other, I kept them at a constant distance of |BC|/3 from B resp. C.
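A minimal sketch of that interpolation step. The exact geometric rule is only shown in the picture, so the tangent directions here (parallel to C − A at B, and to D − B at C, Catmull-Rom style) are an assumption; the |BC|/3 support distance is taken from the text:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 mul(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
static double norm(Vec3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

// Cubic Bézier segment between trajectory points B and C; A and D are the
// neighbouring trajectory points (prev-prev and next-next). t runs from 0
// (at B) to 1 (at C); the main loop samples it at 60 Hz.
Vec3 bezier(Vec3 A, Vec3 B, Vec3 C, Vec3 D, double t) {
    double len = norm(sub(C, B)) / 3.0;             // constant support distance |BC|/3
    Vec3 tanB = sub(C, A), tanC = sub(D, B);        // assumed tangent directions
    Vec3 P1 = add(B, mul(tanB, len / norm(tanB)));  // support point near B
    Vec3 P2 = sub(C, mul(tanC, len / norm(tanC)));  // support point near C
    double u = 1.0 - t;
    // Bernstein form: u³·B + 3u²t·P1 + 3ut²·P2 + t³·C
    return add(add(mul(B, u * u * u), mul(P1, 3 * u * u * t)),
               add(mul(P2, 3 * u * t * t), mul(C, t * t * t)));
}
```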

Mass Inertia. The last step also computes the lampshade's acceleration, since the Bézier curve does not take into account that 400 grams are moved in total. As a consequence, the acceleration needs to be limited to ½ g to prevent flapping caused by the elastic construction and the backlash of the servo motors. This is done by checking whether the next position can be reached without accelerating above the limit. If not, the new position is computed by taking the current position and adding the maximum distance (based on the current speed and the maximum acceleration, capped at ½ g) along the current speed vector. In the end, the resulting curve leaves the Bézier curve wherever it is too sharp.
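The capping logic can be sketched in one dimension as follows (the real code works on 3D vectors; names and structure here are my own):

```cpp
#include <cmath>

// One axis of the motion state: position in metres, velocity in m/s.
struct AxisState { double pos; double vel; };

// Advance one 60 Hz tick towards `target`. If reaching it would require
// accelerating above aMax (½ g ≈ 4.9 m/s²), cap the acceleration and advance
// along the current speed vector instead — the output then leaves the Bézier
// curve wherever it is too sharp.
double limitAcceleration(AxisState& s, double target, double dt, double aMax) {
    double velNeeded = (target - s.pos) / dt;   // velocity required to reach target
    double accNeeded = (velNeeded - s.vel) / dt;
    if (std::fabs(accNeeded) > aMax) {
        double acc = (accNeeded > 0 ? aMax : -aMax);
        velNeeded = s.vel + acc * dt;           // accelerate only as much as allowed
    }
    s.pos += velNeeded * dt;
    s.vel = velNeeded;
    return s.pos;
}
```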

Kinematics. The output of all this is a 3D point which is passed to the kinematics module that computes the angles of all servo motors (so-called inverse kinematics). This part is textbook robotics; it works as follows:

The algorithm starts with the point and orientation (i.e. the tensor) of the head's centre A. The first step is to compute positions B and C from the head orientation. This can be done by taking the point C relative to the position of A (C.point - A.point), rotating it by the orientation of the head (A.rotation), and adding it to the point A.

C := A.point + rotate(C.point-A.point, A.rotation)

Then, the base angle at F, which is the servo angle, can be computed by

F.angle := atan2(A.point.z, A.point.x)

The angles at E and D are computed by considering the triangle EDC and computing its angles with the law of cosines.
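For illustration, that cosine-law step might look like this, with hypothetical link lengths l1 = |ED| and l2 = |DC| and the distance d = |EC| to the target:

```cpp
#include <cmath>
#include <utility>

// Inner angles of the triangle EDC via the law of cosines
// (c² = a² + b² − 2ab·cos γ). Returns {angle at E, angle at D} in radians.
// Link lengths and naming are assumptions for this sketch.
std::pair<double, double> armAngles(double l1, double l2, double d) {
    double angleE = std::acos((l1 * l1 + d * d - l2 * l2) / (2.0 * l1 * d));
    double angleD = std::acos((l1 * l1 + l2 * l2 - d * d) / (2.0 * l1 * l2));
    return {angleE, angleD};
}
```

With both links of length 1 and the target at distance √2, the elbow angle at D comes out as 90° and the angle at E as 45°, as expected for a right isosceles triangle.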

These angles are passed via I2C to the ATmega, where the Arduino library generates a 60Hz PWM signal for the servos.
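For reference, a sketch of the angle-to-pulse-width mapping the Arduino Servo library performs on the ATmega. The 544–2400 µs bounds are the library's defaults; the function name is my own:

```cpp
// Map a servo angle of 0–180° linearly onto a pulse width in microseconds,
// using the Arduino Servo library's default bounds (MIN_PULSE_WIDTH = 544,
// MAX_PULSE_WIDTH = 2400). Out-of-range angles are clamped.
int angleToPulseWidthUs(int angleDeg) {
    const int minUs = 544, maxUs = 2400;
    if (angleDeg < 0)   angleDeg = 0;
    if (angleDeg > 180) angleDeg = 180;
    return minUs + (maxUs - minUs) * angleDeg / 180;
}
```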

In the beginning I was scared of the high CPU load of 3D kinematics and tried to implement it with fixed-point integers and interpolated trigonometry (I was used to a 24 MHz ATmega). What a surprise when I realized that using floats and sin/cos with no caching or table lookup had no noticeable performance impact on the Odroid U3.

Facial Recognition. The facial recognition module uses OpenCV 3.0 with Haar cascade classifiers. Although the newer LBP cascades are significantly faster, they produced many more false positives, so I decided that 10 fps with Haar cascades was sufficient. From the 2D position of the detected face, the 3D position is estimated assuming a standard face size, which worked surprisingly well. Later on, Luci's trajectory planning module moves towards the face if it is very close, to simulate real interest, and moves away if it violates the European intimacy distance.

Tracking a face in real time was a bit tricky, since grabbing images from the video stream plus face recognition has a latency of 250 ms. So the computation of the face's 3D position needs to be done relative to the webcam's position 250 ms ago. Consequently, when Luci moves quickly and the image becomes blurry, this does not work satisfactorily, so the orientation of the head is directed towards the last stable face position until Luci moves slower and following the face in real time becomes possible again.

The trajectory planning module computes the next two points in advance in order to calculate the Bézier curve. Consequently, the detected face position is no longer valid when the kinematics module sends a position to the servos a couple of seconds later. The solution is to permanently compute the current 3D position and send it to the kinematics module, in order to change the head orientation towards the face in real time.

This project changed me not at all. Like at the office, everything just took longer than expected. Surprisingly, the software and hardware worked out quickly, but getting the construction and the mechanics into a shape that worked, was not too heavy, and had properly mounted servos and springs took me a couple of weekends in the basement. The maths was definitely challenging. Getting facial recognition done was the simplest part, but gets the most ahhs and ohhs; the guys from OpenCV did a pretty good job at making it really easy. The most fun was the trajectory planning, i.e. how Luci should move when the webcam recognizes a moving face.