This is a project in development for the module “Digital Ecologies” at the Bartlett’s AAC MSc.

A delta robot is controlled by a Kinect through Processing and Arduino. The performer’s movements directly control the position of the robot’s effector, as well as the rotation and opening of the gripper.
Once the platform is properly calibrated (still a little rough round the edges!), several autonomous behaviours will be implemented.
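The core of such a setup is mapping tracked joint positions from the Kinect’s sensor frame into the robot’s reachable workspace before sending them over serial to the Arduino. Below is a minimal, hypothetical sketch of that mapping step in Java (the language underlying Processing): the coordinate ranges and workspace limits are placeholder assumptions, not the project’s actual calibration values, and the clamping keeps the effector inside a safe envelope.

```java
// Hypothetical mapping from a Kinect joint position (metres, sensor frame)
// into a delta robot's effector workspace (millimetres).
// All numeric ranges are illustrative assumptions, not the project's values.
public class KinectToDelta {
    // Assumed safe workspace limits for the effector, in mm (placeholders).
    static final double X_MIN = -100.0, X_MAX = 100.0;

    // Linear interpolation with clamping, so the effector can never be
    // commanded outside its safe envelope even if tracking jitters.
    static double map(double v, double inMin, double inMax,
                      double outMin, double outMax) {
        double t = (v - inMin) / (inMax - inMin);
        if (t < 0.0) t = 0.0;
        if (t > 1.0) t = 1.0;
        return outMin + t * (outMax - outMin);
    }

    // Assume the performer's hand spans roughly -0.6..0.6 m laterally
    // in front of the sensor.
    static double handXToEffectorX(double handX) {
        return map(handX, -0.6, 0.6, X_MIN, X_MAX);
    }

    public static void main(String[] args) {
        // Hand at the centre of the tracked range -> centre of workspace.
        System.out.println(handXToEffectorX(0.0));
        // Hand beyond the tracked range -> clamped to the workspace edge.
        System.out.println(handXToEffectorX(1.0));
    }
}
```

In a real Processing sketch, the resulting millimetre coordinates would then be serialised (e.g. over `Serial`) to the Arduino, which solves the delta robot’s inverse kinematics for the three arm angles.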

Cool to see what people are doing with robots + Kinect. I did something similar for an education outreach event a few weeks ago: http://vimeo.com/19776225. There’s something compelling about turning gestures into robotic motion.

http://fatcat1111.pip.verisignlabs.com/ fatcat1111

Best part of this is when the operator mentally switches from “I want to control this” mode to “I don’t want to control this” mode (in order to reposition the target), and is surprised when the robot keeps reacting to him. I imagine they will need voice or gesture commands to switch it to standby, hopefully before somebody gets an eye poked.