Hi, I would like to try to use an Apple Trackpad to control the Crazyflie, but since I do not have a Crazyflie I do not know how well this will work in practice.

The idea is to have the following commands:
1) Swipe two fingers up and down to control the thrust
2) Swipe two fingers left and right to control the yaw.
3) Swipe one finger up and down to move forward and back.
4) Swipe one finger left and right to move left or right.

So far, detecting the finger motion has been surprisingly simple. I am now trying to integrate my controller into the cfclient. In particular, I am trying to implement something similar to input.JoystickReader.
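To make the idea concrete, here is a rough sketch of what such a reader might look like: it takes raw trackpad deltas and turns them into (roll, pitch, yaw, thrust) values, similar in spirit to what input.JoystickReader produces. Every class name, scale factor, and method here is made up for illustration, not the actual cfclient API:

```python
# Hypothetical trackpad "reader" shaped like cfclient's input.JoystickReader:
# poll finger deltas, emit (roll, pitch, yaw, thrust).
# All names and scale factors are assumptions, not real cfclient code.

class TrackpadReader:
    """Turns raw trackpad deltas into flight-control values."""

    THRUST_MAX = 65535   # thrust is a 16-bit unsigned integer
    ANGLE_MAX = 20.0     # cap roll/pitch at +/-20 degrees (arbitrary choice)

    def __init__(self):
        self.thrust = 0
        self.yaw = 0.0

    def update(self, fingers, dx, dy):
        """fingers: number of touching fingers; dx/dy: deltas in [-1, 1]."""
        roll = pitch = 0.0
        if fingers == 2:
            # two-finger vertical swipe -> thrust, horizontal -> yaw rate
            self.thrust = max(0, min(self.THRUST_MAX,
                                     self.thrust + int(dy * 2000)))
            self.yaw = dx * 90.0  # degrees/second, arbitrary scale
        elif fingers == 1:
            # one-finger swipes -> pitch (forward/back) and roll (left/right)
            pitch = dy * self.ANGLE_MAX
            roll = dx * self.ANGLE_MAX
        return roll, pitch, self.yaw, self.thrust
```

Note that thrust is accumulated across updates (so it persists when you lift your fingers) while roll and pitch are momentary, which seemed closest to how a joystick behaves.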

That's a great idea - I'm interested in this as well, and would like to use what you've developed so far (and possibly help contribute). Those are great questions and I'm interested in the answers as well - it'd be nice to learn about the platform while we're waiting for the hardware.

Will you be checking your code in to the repository or at least forking it publicly?

Hi Mike,
I am more than happy to share what I am doing, but I would like to reach a point where at least the most basic functionality does not crash. This is one of the good parts of open source development: since everybody can see the code, developers are ashamed of sloppy hacks and strive for quality.

The public fork is a good option, but, as I said, give me a few days to reach a point where it makes sense to show the code to the whole world.

Cool, of course, commit when you're ready. Just wanted to let you know I'm interested and willing to collaborate.

I took a cursory look at the repositories and hope to dig in some more this weekend as well. I'm curious how much work it'll take to get things up and running on OS X - my main concern is the USB radio support. If it's really going to be a lot of work, I may start off with Linux running under Parallels... then after I verify the whole system works, switch over and start hacking on OS X.

Thanks for the reply and I look forward to hacking on this thing with you!

Great, I think we have our first community development started. I'm not the Python and PC expert, so I hope Arnaud or Marcus will jump in. I know Arnaud quickly tested the cfclient on OSX but had problems with the joystick driver. Other than that, I think it worked pretty easily.

Implementing what you want should not be any problem as long as you can translate the commands into roll, pitch, yaw and thrust.

tobias wrote:
Implementing what you want should not be any problem as long as you can translate the commands into roll, pitch, yaw and thrust.

Tobias, thanks for your answer. Yes, that is the key point. I would like to know what those 4 numbers look like in practice:
1) What is the expected range?
2) If, for argument's sake, I have got something like this: roll = 0, pitch = 0, yaw = 0, thrust = 100 (totally made up number), does it mean I am hovering? How about roll = 0, pitch = 30, yaw = 0, thrust = 100? Does it mean I am going forward? And after that, if I go back to roll = 0, pitch = 0, yaw = 0, thrust = 100, will the Crazyflie stop and hover in position?

I see from the link that you are planning to detect the raw trackpad touch location in Python. I was actually thinking about taking a different approach: building a native OS X app and using NSEvent to get swipe, scroll, pinch, and rotate gestures; then converting those gestures into appropriate pitch, yaw, roll, and thrust values and feeding them into the Python library.

With that approach I think you could do some really neat things - for example, imagine a 2-finger swipe left would move the Crazyflie to the left, with the initial velocity matching the gesture velocity, and the deceleration somewhat matching how OS X gestures decelerate. A rotate gesture could rotate the Crazyflie on its axis, a pinch out could increase the current velocity, a pinch in could decrease the current velocity, etc.

So, basically you'd be flying by continuously swiping/pinching/rotating, and each gesture would change the Crazyflie's current heading by a relative amount appropriate for the gesture. We'd be leveraging OS X to interpret the gestures, and I think it might provide for a more controlled flight, perhaps even more so than using a PS3 controller.
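The "deceleration matching OS X gestures" part could be approximated with a simple exponential decay of the gesture's release velocity. A minimal sketch, where the half-life constant is my guess and not Apple's actual momentum curve:

```python
# Rough sketch of gesture momentum: start from the velocity at the moment
# the fingers lift, then decay it exponentially, roughly how OS X scrolling
# coasts to a stop. The half_life value is a guess, not Apple's real curve.

def decaying_velocity(v0, t, half_life=0.35):
    """Velocity t seconds after the gesture ends (half_life in seconds)."""
    return v0 * 0.5 ** (t / half_life)

# A swipe released at 1.0 (in whatever units we map to Crazyflie velocity)
# has halved after one half-life and is nearly zero after five.
```

Each control-loop tick would then feed the decayed velocity into the pitch/roll mapping until the next gesture overrides it.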

I will try to prototype something in the next few days... if you're interested in this approach, I'll throw it up on GitHub and can provide you with a link to the repository...

roll and pitch should be floating point numbers in degrees, where 0 is level.
yaw should be a floating point number in degrees and is normally the turn rate.
thrust should be a 16-bit unsigned integer.
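Based on that description, packing the four values into the right ranges would look something like the sketch below. The commander call in the comment is how I understand the Python client library's API, so treat it as an assumption rather than verified code:

```python
# Pack the four values into the ranges Tobias describes: roll/pitch as
# floats in degrees (0 = level), yaw as a float turn rate in degrees/s,
# thrust clamped into 16-bit unsigned integer range (0..65535).

def make_setpoint(roll, pitch, yawrate, thrust):
    """Clamp thrust into uint16 range; leave the angles as floats."""
    thrust = max(0, min(65535, int(thrust)))
    return float(roll), float(pitch), float(yawrate), thrust

# Untested sketch of how this might be sent to the copter:
# roll, pitch, yawrate, thrust = make_setpoint(0.0, 30.0, 0.0, 40000)
# cf.commander.send_setpoint(roll, pitch, yawrate, thrust)
```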

Hi Tobias,
this helps, thanks! Meanwhile I have a semi-working solution. The reason I have not posted anything yet is that I am struggling to integrate with the InputConfigDialogue class. This might even be a Mac OSX specific problem, since on Linux I see a different behaviour.
Still investigating.