Sunday, November 10, 2013

RS4 - Self balancing Raspberry Pi OpenCV image processing Robot

Here is the robot that I'm working on. You can see the latest video here, although it has undergone some modifications since then.

I'll divide this description into topics, as it's easier for me to describe it this way. The idea to build this robot came from buying a Raspberry Pi; when I saw it I said, "I've got to build a robot with this" :) . I have built other robots in the past, but this one is the most complex and the first with image processing.

Chassis

The robot chassis was designed by me. I used a 3D tool to generate some previews, mainly because I needed an idea of the size and component distribution before building it. Here you can see the model of the robot:

After this I began the building process. I bought a carbon fiber plate (more or less the size of an A4 sheet) and cut all the pieces by hand with a mini drill (unfortunately I don't have a CNC machine for this job). I bought some aluminum profiles to make spacers and fixing parts, as you can see in the next photo. The result is a very light and strong chassis.

Motors and wheels

I'm using stepper motors in this robot, for no special reason. I bought them as NEMA 17 motors; the motor reference is LDO-42STH38-1684A 121121 (LDO Motors). This type of motor has a nice robust look and is usually used in CNC and RepRap machines.

The wheels are from a 1/8-scale RC buggy; you can find them easily in any RC store, as they are a standard size. What I like most about these wheels is their soft touch: they work as dampers for small obstacles, allowing a smooth run. To connect the wheels to the motors I used Traxxas Revo hubs and nuts, as shown in the photo; these are the only ones I found with a 5 mm hole, the same as the motor shaft, so this way it's more or less plug and play.

Head

For pan and tilt I use two micro servos (Tower Pro MG90S), very cheap and easy to get. The head has a holder for the Raspberry Pi camera module, an ultrasonic sensor, and two RGB LEDs.

You can see some details of the robot in the next photos.

Balancing and motor control Board

This robot uses a dedicated board for balancing and motor control (I want to use the Raspberry Pi only for high-level tasks). The board is my own design and uses the following components:

- 2× L298 + 2× L297 stepper motor drivers (yes, I know they are old, but they are cheap and easy to find too; in a future revision I'll use something from this century :) );

- Murata ENC-03 gyroscope, an analog single-axis gyro, very easy to use;

- MMA7361L accelerometer, a 3-axis analog accelerometer (I use a module; this chip is too small for hand soldering);

- PIC24FJ64GA002 microcontroller.

The board supports I2C and serial communication. A photo of the board and the motors:

Servo control board

I'm using a modified motor board to control the two servos and to read the ultrasonic sensor (not yet in use). This is a temporary solution; I intend to design a dedicated servo control board or buy one.

Power

The robot is powered by a 2000 mAh 3S LiPo battery. To generate the required voltages I'm using one 3.3 V regulator and two 5 V switching regulators. I want to design a dedicated power board in a future revision.

Balancing control

PID

Balancing control is performed by a PID cascade, as shown in the next picture. This way it is possible to balance the robot even if you move its center of mass or run it on a ramp: it will find a new angle that allows it to stay balanced and stopped. In fact, both controllers are PI only; the derivative gain is set to 0 because it makes the robot shake even at small values.

PID implementation is as simple as this:

pTerm = Kp * error;                     // proportional term
sum += error;                           // accumulate error for the integral
iTerm = Ki * sum * Ts;                  // integral term, scaled by the sample period Ts
dTerm = Kd * (error - lastError) / Ts;  // derivative term
Cn = pTerm + iTerm + dTerm;             // controller output
lastError = error;                      // remember the error for the next cycle
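Wrapped into a reusable routine, the snippet above might look like this in C (the `PID` struct and `pidStep` names are mine, not from the robot's firmware; this is a sketch, not the actual board code):

```c
#include <assert.h>

/* Discrete PID state, matching the snippet above: the integral term is
   scaled by the sample period Ts and the derivative is divided by it. */
typedef struct {
    float Kp, Ki, Kd;  /* controller gains */
    float Ts;          /* sample period in seconds */
    float sum;         /* accumulated error (integral) */
    float lastError;   /* previous error (derivative) */
} PID;

/* One controller update: returns the control output Cn. */
float pidStep(PID *pid, float error)
{
    float pTerm = pid->Kp * error;
    pid->sum += error;
    float iTerm = pid->Ki * pid->sum * pid->Ts;
    float dTerm = pid->Kd * (error - pid->lastError) / pid->Ts;
    pid->lastError = error;
    return pTerm + iTerm + dTerm;
}
```

With Kd set to 0, this reduces to the PI behavior actually used on both stages of the robot.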

For PID tuning I used a Bluetooth module, which allows me to adjust Kp, Ki, and Kd for both controllers in real time. This way you can immediately see the effects and reach the desired behavior. In this video you can see it successfully balanced for the first time.

Sensor fusion

Sensor fusion (gyroscope + accelerometer to get the leaning angle) is performed by a Kalman filter; not much to say about it, it works really well. Follow this fantastic tutorial: it has everything you need to know, including an explanation and an implementation.
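For comparison only (the robot itself uses the Kalman filter from that tutorial), a complementary filter is a much simpler way to fuse the same two sensors; a minimal sketch, with the blend factor as an illustrative value:

```c
#include <assert.h>

/* Complementary filter: trust the integrated gyro rate on short time
   scales and the accelerometer angle on long ones. alpha near 1 means
   the gyro dominates; (1 - alpha) slowly pulls toward the accelerometer. */
float fuseAngle(float angle, float gyroRate, float accAngle,
                float Ts, float alpha)
{
    return alpha * (angle + gyroRate * Ts) + (1.0f - alpha) * accAngle;
}
```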

Motion control

OK, the robot is balanced, but now it needs to move. Moving forward and back is quite easy with this PID cascade setup: you just give a set point to the first controller and it will calculate the appropriate leaning angle to reach that speed. Something like this:

setAngle = calcCn1(instSpeed - setSpeed);  // outer loop: speed error -> target angle
instSpeed = calcCn2(angle - setAngle);     // inner loop: angle error -> motor speed

To turn the robot I attenuate the speed of one wheel, depending on the side it needs to turn toward. The robot keeps its balance because both wheels still reflect the control system speed. The implementation looks like this:

instSpeedL = instSpeedR = instSpeed;
motorSpeedL(instSpeedL * factorL);
motorSpeedR(instSpeedR * factorR);

0 ≤ factorL ≤ 1, 0 ≤ factorR ≤ 1

To perform spins, rotating the robot about its own axis, I give an opposite speed offset to each wheel. With the wheels rotating at symmetric speeds, the robot spins and stays balanced. Completing the implementation, it looks like this:

motorSpeedL(instSpeedL * factorL + spinSpeed);
motorSpeedR(instSpeedR * factorR - spinSpeed);

If spinSpeed is positive the robot spins clockwise; otherwise it spins counterclockwise.
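Putting the steering attenuation and the spin offset together, the whole wheel-speed computation can be sketched as one function (the function name and signature are mine; a sketch of the logic described above):

```c
#include <assert.h>

/* Combine the balance controller's speed with the per-wheel steering
   attenuation (0..1) and the spin offset into the two motor commands. */
void wheelSpeeds(float instSpeed, float factorL, float factorR,
                 float spinSpeed, float *left, float *right)
{
    *left  = instSpeed * factorL + spinSpeed;  /* positive spin: clockwise */
    *right = instSpeed * factorR - spinSpeed;
}
```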

That’s the way I found to control the robot's motion; there are possibly other methods. Another important point: with stepper motors you shouldn't apply big speed changes abruptly or they will slip. This can be solved with a low-pass filter applied to factorL/R and spinSpeed, and it works well in my robot. In this video you can see a run with Bluetooth control. It can run faster than this, but it will easily fall if it hits small bumps on the road.
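The low-pass filtering mentioned above can be as simple as a first-order exponential smoother applied every control cycle; a sketch (the smoothing coefficient is illustrative, not the value I use):

```c
#include <assert.h>

/* First-order low-pass: move the applied value a fraction k (0..1) of
   the way toward the target each cycle, so the steppers never see an
   abrupt speed change that could make them slip. */
float lowPass(float current, float target, float k)
{
    return current + k * (target - current);
}
```

Applied to factorL, factorR, and spinSpeed before they reach the motor commands, this turns a step change from the remote into a smooth ramp.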

Raspberry Pi

I'm using a Raspberry Pi Model B (256 MB) with a micro SD adapter because of the limited space on the robot. I have a small WiFi adapter, but the robot is not using it yet. The installed operating system is Raspbian, and I managed to get OpenCV working with the camera module thanks to this tutorial; great stuff here:

At the moment I'm using serial communication between the Raspberry Pi and the motor and servo control boards, but I intend to use I2C, as it is a more appropriate method. The reason I'm using serial now is that the interface code was already done for the Bluetooth module (a cheap serial Bluetooth module). I still have to spend some time on the I2C interface.

The serial interface on the Raspberry Pi is quite easy; you just have to disable terminal emulation on that port. I'm using the WiringPi library for serial communication and to control the Pi's GPIOs, without any issues.
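The post doesn't spell out the byte protocol that travels over this serial link, so the following framing is purely hypothetical (the marker, command ID, and XOR checksum are my invention), but it shows the kind of small packet a Pi-to-motor-board link typically carries:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical 4-byte frame: start marker, command ID, one payload
   byte, and an XOR checksum so the board can reject corrupted frames.
   Returns the frame length (always 4 here). */
int buildFrame(uint8_t cmd, uint8_t payload, uint8_t buf[4])
{
    buf[0] = 0xA5;                                 /* start-of-frame marker */
    buf[1] = cmd;
    buf[2] = payload;
    buf[3] = (uint8_t)(buf[0] ^ buf[1] ^ buf[2]);  /* XOR checksum */
    return 4;
}
```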

Image processing

I have very little experience with image processing; this is the first time I'm using OpenCV and I'm still learning how to use it. My first example is object (ball) tracking by color filtering, as in this tutorial:

It works well but is sensitive to lighting changes. At the moment I'm using the YCrCb color space instead of HSV, but the results are similar. With the object's screen coordinates I drive the servos to point the camera at the object, and I steer the robot based on the head angle.
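Pointing the camera at the object reduces to a proportional correction of each servo angle from the pixel error; a minimal sketch of that idea (the gain, image center, and function name are illustrative, not the robot's actual values):

```c
#include <assert.h>

/* Nudge a servo (pan or tilt) toward the tracked object: the step is
   proportional to how far the object's pixel coordinate sits from the
   image center on that axis. */
float trackStep(float servoAngle, float objectPx, float centerPx, float gain)
{
    return servoAngle + gain * (objectPx - centerPx);
}
```

Running this once per frame for pan (x) and tilt (y) keeps the object centered; the resulting head angle then steers the robot.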

Ball following was the first simple example to integrate all the parts of the robot. Its behavior was funny, so I decided to publish the video on YouTube.

Final remarks

This robot is an ongoing project; I'm continuously building new parts and modifying others. I don't have a defined goal for it, but I would like to give it some autonomous navigation capabilities. It has real potential; I just have to work on image processing and learn some more techniques. I intend to add a speaker too.

The initial robot sketch had two arms. They would look cool, but they are a lot of work to build, and I'm aware it is hard to give them a useful function like grabbing objects. I could use the arms to get the robot back up after a fall; maybe in a future update.

I have implemented odometry in this robot, though I'm not using it at the moment. A 3-axis gyro would be very useful to correct odometry angle errors; a point to review in a future revision.

Hello! Thanks for your reply. Yes, I am :P I'm in Póvoa de Varzim, what about you?

I tried to do all the programming on the RPi with Geany, but I gave up on that idea because of the speed. I thought I could do a lot better with a familiar interface like VS along with the speed of a few-GHz CPU, but I failed to compile the OpenCV libraries for Windows, with errors I know nothing about, and found it difficult to find solutions on the web. So I tried Linux on a VM, and it worked, apart from some hardware problems related to passing the USB camera from the main OS to the VM OS. I finally decided to grab an old computer with Linux and compiled the libraries there. It is working fine (with some adjustments), but the best I can do is write and debug the code there, then grab that *.cpp, place it on the Pi, and compile it there. So far all the code that worked on the computer also worked on the RPi, which shows good compatibility between OpenCV versions.

Your work is very impressive and it's very nice to see that robot doing its thing. Please keep posting new stuff! Thanks.

I asked because your name looked Portuguese. I'm living in Porto. I'm doing something similar to what you're doing: I'm using a Linux virtual machine with OpenCV on the Eclipse IDE. I do all my coding first on the computer and, when I get something I like, I compile it on the RPi. I've never had a problem with compatibility between OpenCV versions.

From now on it will be hard to work on the robot; I have a new job and not much time for it. If you need something, just ask.

We're actually close by. I wanted to master OpenCV as fast as possible, because I'd like to have the robot I'm working on ready by January, but I'm a bit lost with the OpenCV workflow. I'm really not very good with C++, so it's being difficult to write the code for what I want. The arrow functions you developed are exactly one of the features I was thinking of putting in the robot. How were you able to master OpenCV? Did you follow the tutorials or something? Sad to know you will have little time for the self-balancing robot, but I hope to see more amazing projects from you.

Hi! I was wondering, is the Raspberry Pi self-contained? What I'm trying to say is: did you run OpenCV on a monitor and then disconnect the Pi, or is it communicating in some way, maybe wirelessly, with a monitor?

I didn't follow any particular tutorial to begin with OpenCV. What you can do is install it on your PC and start playing with it. There are many examples of object tracking on the web that you can try; even on YouTube you can easily find tutorials on that.

Hi, I have just got my first Raspberry Pi, saw your project, and thought it was cool and something I would like to build. Could I use a USB webcam instead of the Pi camera? And how are the servos for the head driven? It wasn't clear from the info you have shown; is it from a separate driver or directly from the Raspberry Pi? Thanks for any help you can give.

Hello Paul. Yes, you can use a USB camera, but it will be slower than the Pi camera board (see the thinkrpi website). The head servos are not controlled directly by the Pi; a microcontroller generates the signals. Good luck with your work :)

Hi, well done! I am doing something similar for a school project; we are tracking a color using the Raspberry Pi camera board. I find that when running OpenCV on the Raspberry Pi, my image tracking runs very, very slowly (about 2-5 fps), which is entirely too slow for real-time object tracking. Your robot seems to respond very fast. How many fps do you get? How did you achieve this? Any tips would be great!

How did you make your robot not return to its spot when you push it? I made a similar balancing robot with a cascade PID, but mine always comes back to its spot, and it is difficult to control its movement because after stopping it always turns back. In your video I can see that your robot stays in the new place after being pushed. I used DC motors instead of steppers and I measure the speed with encoders, but I don't think that matters; the PID looks and works the same...

In fact my robot has the same behavior you describe. In the video where you see me pushing it with my hand, it is working with only one PID stage; that way it doesn't return to the starting point. I don't see why this is a problem for you; can you explain?

I wanted to make it move forward and backward (spins and steering I want to add later) with my TV remote control and an IR receiver. I did it this way: I use three buttons, "forward", "backward", and "stop". When I press the "forward" button I set the setpoint for the speed PID to a particular positive value. Then when I press "stop" I set this setpoint to 0, as it is by default when the robot doesn't move. "Backward" works the same as "forward", but I set a negative value as the setpoint.

The speed PID calculates the leaning angle for the angle PID to start moving, but it doesn't move smoothly. It moves forward, for example, but after a while it looks like it is trying to stop (it slows down), and then it accelerates again with a greater speed. After several such oscillations it falls down.

Also, when I press "stop" while the robot is moving forward or backward, it usually returns to the place where it was before the movement. For instance, it goes 1 meter forward and after stopping it goes 1 meter backward, so effectively it stays in the same place.

I don't know what I am doing wrong. I just change the setpoint for the speed PID, just like you wrote in your post. Maybe I should also add some stopping routine: not just immediately change the setpoint to 0, but, for instance, decrease it gradually until it reaches 0?

As you can see, it moves with oscillations and it doesn't brake smoothly either. It looks like my PID, especially the speed loop, isn't tuned well. When it tries to reach a setpoint speed it overshoots, then slows down and undershoots, and this process goes on forever. The same happens during braking: when I change the speed setpoint to 0 and the current speed is much greater than 0, it tries to reach 0 very quickly, so the robot brakes very sharply and becomes unstable.

I have no idea how to tune it to work well. When it stands on the spot it looks good, but it's very difficult to tune the PID while the robot moves... :( Did you have such problems when tuning your PID, including the outer speed loop? Maybe you did it in some special way...

It looks like a PID tuning problem; I've seen similar behavior in my robot, but with a lower oscillation frequency. I don't have any special way, just trial and error... In my case the speed set point is increased/decreased gradually, mainly because of the steppers: they slip if the speed change is too big. Try it in your case. I'll try to make a video of the push behavior with the two PID stages for you to compare; it is very distinct.
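That gradual set-point change can be a simple slew-rate limit; a minimal sketch, assuming a fixed maximum step per control cycle:

```c
#include <assert.h>

/* Move the applied set point toward the target by at most `step` per
   control cycle, so the steppers never get a speed jump big enough
   to make them slip. */
float rampToward(float current, float target, float step)
{
    if (target > current + step) return current + step;
    if (target < current - step) return current - step;
    return target;
}
```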

You mean you don't just give the set point as a particular value, but you increase or decrease it gradually until that value is reached? I mean, it doesn't jump from 0 to 100 but is incremented each loop as long as it isn't 100 (100 is only an example value).

Thank you very much for the video. Your robot returns to its spot so smoothly, without any oscillations. My robot also returns to its spot after a disturbance, but with oscillations. Now I know how it should look. Maybe I will tune this PID some day :) Thank you :) I would like to ask only one more question: in the situation where you don't control the robot (like in the video) and it just stays on the spot, is the setpoint for the speed PID loop always 0?

Hi. I succeeded in making my robot move smoothly; it was just a matter of PID constants. Now I have my own PCB and a Raspberry Pi connected to it over serial, so I can tune it from my PC's keyboard (VNC server). I've spent some hours and achieved the following result: http://vimeo.com/113750618

I think it works quite well. Now I can move on and add some vision algorithms with OpenCV. Thank you very much for your help :)

I'm doing a self-balancing robot project. I saw that you tuned the PID gains over Bluetooth. In a PID cascade you have 6 gains; how did you tune them? Which ones did you tune first (the angle gains or the speed gains)? Can you show me your technique? I'm having problems tuning my cascade PID.

Hey! Your earliest video of the balancing robot tracking and following the ball really impressed the heck out of me! Pretty soon afterwards I set about building a similar version of your bot, sans the self-balancing part. Although I seem to have hit a bit of a roadblock and was wondering if you could help me out. I've used an RPi 2 + OpenCV + Python to build the project and have come to the stage where the pan-tilt servos can accurately track a color-filtered object. However, upon integrating the ultrasonic sensor for the corresponding distance measurement, the RPi simply cannot handle the processing requirements and the setup lags very badly. Do you have any thoughts?