No problem, I am new to all this too, and just got my first Arduino and MPU6050 sensor a few weeks ago, so this is all very fresh in my mind on exactly how to get it all working. :-) Glad I could help.

/* =========================================================================
   NOTE: In addition to connecting 3.3V, GND, SDA, and SCL, this sketch
   depends on the MPU-6050's INT pin being connected to the Arduino's
   external interrupt #0 pin. On the Arduino Uno and Mega 2560, this is
   digital I/O pin 2.
 * ========================================================================= */

/* =========================================================================
   NOTE: Arduino v1.0.1 with the Leonardo board generates a compile error
   when using Serial.write(buf, len). The Teapot output uses this method.
   The solution requires a modification to the Arduino USBAPI.h file, which
   is fortunately simple, but annoying. This will be fixed in the next IDE
   release. For more info, see these links:
 * ========================================================================= */

// uncomment "OUTPUT_READABLE_QUATERNION" if you want to see the actual
// quaternion components in a [w, x, y, z] format (not best for parsing
// on a remote host such as Processing or something though)
//#define OUTPUT_READABLE_QUATERNION

// uncomment "OUTPUT_READABLE_EULER" if you want to see Euler angles
// (in degrees) calculated from the quaternions coming from the FIFO.
// Note that Euler angles suffer from gimbal lock (for more info, see
// http://en.wikipedia.org/wiki/Gimbal_lock)
//#define OUTPUT_READABLE_EULER

// uncomment "OUTPUT_READABLE_YAWPITCHROLL" if you want to see the yaw/
// pitch/roll angles (in degrees) calculated from the quaternions coming
// from the FIFO. Note this also requires gravity vector calculations.
// Also note that yaw/pitch/roll angles suffer from gimbal lock (for
// more info, see: http://en.wikipedia.org/wiki/Gimbal_lock)
#define OUTPUT_READABLE_YAWPITCHROLL

// uncomment "OUTPUT_READABLE_REALACCEL" if you want to see acceleration
// components with gravity removed. This acceleration reference frame is
// not compensated for orientation, so +X is always +X according to the
// sensor, just without the effects of gravity. If you want acceleration
// compensated for orientation, use OUTPUT_READABLE_WORLDACCEL instead.
#define OUTPUT_READABLE_REALACCEL

// uncomment "OUTPUT_READABLE_WORLDACCEL" if you want to see acceleration
// components with gravity removed and adjusted for the world frame of
// reference (yaw is relative to initial orientation, since no magnetometer
// is present in this case). Could be quite handy in some cases.
//#define OUTPUT_READABLE_WORLDACCEL

// uncomment "OUTPUT_TEAPOT" if you want output that matches the
// format used for the InvenSense teapot demo
//#define OUTPUT_TEAPOT

Send any character to begin DMP programming and demo:
Initializing DMP...
Enabling DMP...
Enabling interrupt detection (Arduino external interrupt 0)...
DMP ready! Waiting for first interrupt...
ypr	0.09	-2.43	8.77
areal	-1113792561
ypr	0.07	-2.45	8.82
areal	-1444643145
ypr	0.05	-2.46	8.86
areal	-1635243579
ypr	0.03	-2.48	8.91
areal	-1745623896

Is ypr about the gyroscope? However, the areal values are still strange...

Yaw/pitch/roll is presumably found by a fusion algorithm contained within the DMP, so it uses both the gyroscope and the accelerometer to determine yaw/pitch/roll. Remember, a gyroscope does not actually measure angular position; it measures angular velocity. To determine angular position from angular velocity, you must integrate the gyroscope's values over time, and this accumulates error because of noise and imperfection in the sensor. The accelerometer, meanwhile, should always read 1 G of force in the Z direction when the Z axis is normal to the ground plane. With that information you can also use an accelerometer to estimate angular position, but accelerometers are very noisy, so their data alone is not great either. A fused algorithm combining both is presumably much better.
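The two estimates described above can be sketched in plain C++. This is just an illustration of the trade-off, not the DMP's actual (proprietary) algorithm; the function names, the bias figure, and the sample period are all invented for the example:

```cpp
#include <cmath>

const double DEG_PER_RAD = 180.0 / 3.14159265358979323846;

// Gyro path: integrate angular velocity (deg/s) over time. Any constant
// bias in the gyro reading accumulates as drift in the angle estimate.
double integrateGyro(double rateDegPerSec, double biasDegPerSec,
                     int steps, double dt) {
    double angle = 0.0;
    for (int i = 0; i < steps; ++i)
        angle += (rateDegPerSec + biasDegPerSec) * dt; // error grows each step
    return angle;
}

// Accelerometer path: when the sensor is static it measures only gravity,
// so a tilt angle follows from the direction of the 1 g vector.
double accelPitchDeg(double ax, double ay, double az) {
    return atan2(-ax, sqrt(ay * ay + az * az)) * DEG_PER_RAD;
}
```

With a perfect gyro the integral recovers the true angle, but even a small bias (second argument) makes a stationary sensor appear to rotate; the accelerometer estimate has no drift but is noisy sample to sample, which is why fusing the two works better than either alone.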

To get that fancy display to work, you need to have Processing working on your computer, and then find some libraries for it. I don't recall all the details now, but it was quite a hassle to get it to work.

Oh, also: the accelerometer values. I forget what units they are in, but they are not in G by default. I had to convert them by dividing by 28xx (I don't remember the exact "xx" digits). My conversion and code are at home; I will look for them later. If you don't need an EXACT value now, just let the sensor sit still and take an average value of the Z axis (make sure the sensor is as flat as possible). That value should be about equal to 1 G of acceleration, so divide all of your accelerometer values by that number before printing them to the serial port and they will be in G's.
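A minimal sketch of that calibration trick, in plain C++: with the board flat and still, the Z axis is reading ~1 G, so the average raw Z value can serve as the counts-per-G divisor for every axis. The function names and the sample values below are invented for illustration; the true divisor depends on the configured full-scale range (for reference, the MPU-6050 datasheet gives 16384 LSB/g at the default ±2 g setting):

```cpp
#include <cstddef>

// Average the raw Z-axis readings taken while the sensor sits flat and
// still; since the only acceleration present is gravity, this average is
// approximately the sensor's raw counts per 1 g.
double countsPerG(const int* rawZ, std::size_t n) {
    double sum = 0.0;
    for (std::size_t i = 0; i < n; ++i) sum += rawZ[i];
    return sum / n;
}

// Convert any raw axis reading to G's using that divisor.
double rawToG(int raw, double scale) {
    return raw / scale;
}
```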


You don't need Processing to get the display in the video I posted. That is C# code I wrote, with OpenTK rendering the cube based on the yaw/pitch/roll values output from the sensor. It did take a decent amount of work to get working, only because I had never tried 3D rendering before, so I had to learn about that and the different libraries out there, and finally chose OpenTK (which is free).

I found the toxi libraries for the Processing sketch. But I still don't understand how to convert the raw values into the real position of my board in space, or whether that's even possible. Can you explain this to me?

What exactly are you looking for? Accelerometers measure acceleration, and gyros measure angular velocity. With the code that you have, you can get yaw/pitch/roll and acceleration. If you want position as well as orientation, you will need another sketch, or to do your own math on the data from the sensors. Position can be found by integrating velocity, which in turn can be found by integrating acceleration. Ideally you will want to use a fused data algorithm for this rather than the direct data from just the accelerometer. A complementary or Kalman filter is probably your best bet.
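For the complementary filter mentioned above, here is a minimal single-axis sketch in plain C++: trust the gyro over short timescales (integrate its rate) and let the accelerometer's tilt estimate pull the result back over long timescales. The 0.98 blend factor is a typical but arbitrary choice, and the function name is invented; this is not code from the sketch in this thread:

```cpp
// One filter update: previous angle estimate, current gyro rate (deg/s),
// current accelerometer-derived angle (deg), and the sample period (s).
double complementaryStep(double prevAngle, double gyroRateDegPerSec,
                         double accelAngleDeg, double dt,
                         double alpha = 0.98) {
    // Short-term: integrate the gyro rate onto the previous estimate.
    double gyroAngle = prevAngle + gyroRateDegPerSec * dt;
    // Long-term: blend in the (noisy but drift-free) accelerometer angle.
    return alpha * gyroAngle + (1.0 - alpha) * accelAngleDeg;
}
```

Called once per sample in a loop, the gyro term tracks fast motion while any accumulated gyro drift decays toward the accelerometer's reading. Note this gives orientation only; position would still require double-integrating acceleration, which drifts badly without an external reference.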

Search for FreeIMU on Google and download the library. The fusion algorithm there is completely open source, so you can try to decipher it yourself. It does not include code to calculate position, though, so you will need to look elsewhere for position sensing.

My code is not for Arduino. It is a C# project written in Visual Studio 2012, with the OpenTK library used for OpenGL rendering in 3D. The C# program opens the COM port that the Arduino uses and reads the lines as they come in. If you have Visual Studio 2012 and a basic understanding of OpenGL and 3D rendering, you might be able to get my C# project to work on your computer; otherwise you are probably better off trying to get the Processing code to work.

I'm having problems getting toxi.geom to be 'seen' in my Processing sketch. I downloaded it from GitHub and "I've stuck it all over the place", but it still tells me it's missing. I'm not really familiar with Processing, so it could be a basic error, but...