Autonomous tank terrorizing campus

As a senior design project for ECE4007, [Nate], [An], [Chris], and [Wink] built an autonomous toy tank. It uses a Panasonic IR motion sensor to find targets; once it’s facing a target, it switches to visual motion tracking through its webcam. If it can get close enough, it stops and begins rotating the turret for more accuracy. Finally, it fires a pellet. Its brains are an ICOP Technology eBox-2300 running Windows CE. All of the programming is available on the site, as well as a breakdown of the various sensors and hardware. As you can see in the video after the break, it does a decent job. Given some more time, we’re sure they could speed up the target acquisition process. Maybe we should add a category for GeorgiaTechfinalprojects.
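The writeup describes the tank’s behavior as a sequence of modes: scan with the IR sensor, track visually, stop and aim the turret, then fire. A minimal state-machine sketch of that loop, in Python; the state names and the boolean sensor flags (`ir_hit`, `target_centered`, `close_enough`, `turret_aligned`) are my own labels, not taken from the project code:

```python
from enum import Enum, auto

class State(Enum):
    SCAN = auto()   # sweep the IR motion sensor for a warm, moving target
    TRACK = auto()  # visually track the target with the webcam
    AIM = auto()    # stopped close enough; rotate the turret
    FIRE = auto()   # shoot a pellet, then go back to scanning

def step(state, ir_hit, target_centered, close_enough, turret_aligned):
    """One tick of a hypothetical targeting loop."""
    if state is State.SCAN:
        return State.TRACK if ir_hit else State.SCAN
    if state is State.TRACK:
        if close_enough:
            return State.AIM
        return State.TRACK if target_centered else State.SCAN
    if state is State.AIM:
        return State.FIRE if turret_aligned else State.AIM
    return State.SCAN
```

Losing the target while tracking drops the machine back to the IR scan, which matches the sensor hand-off the article describes.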

I like the idea of just letting a few of these loose on campus… they might not be dangerous, but there is something to be said for little machines that track anyone they sense, follow them, and shoot a pellet at their knee.

That is awesome, but it needs to shoot a lot faster to chase people around. Whoever made this should take it to Japan and let it go in the streets, after GPS is installed so it can hide out when its battery is low.

Why not just oxy-weld up some 3–5 mm thick plate and use that as the body? Then if someone does step on it, it survives. The only extras would be a stronger motor so it could pull the weight, and some sort of internal suspension to absorb the impact if someone kicks it.

It seems from the vid that the target needs to wear a red wrap around his leg and help it along a lot, so it’s a bit less impressive than the writeup makes it seem.

This makes me think: I wonder if you could hack something onto a digital camera’s software. Most point-and-shoot cameras now have face recognition (and even smile recognition) plus focusing, often highlighted with squares on the display. If you could somehow tap into its output, you might have a pre-made targeting system of sorts: take the video-out, apply a filter to find the rectangle, then adjust a motor to center it (or aim straight below the square to hit the body, if it’s a device that shoots something like pellets). That would disregard the distance info, though, but I’m sure you could go more complex.
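The “adjust a motor to center the square” idea boils down to computing how far the detection box is from the frame center and turning that into pan/tilt error signals. A minimal sketch, assuming the box arrives as an (x, y, w, h) rectangle in pixels; the function name and the `aim_below` shift (to aim at the body rather than the face) are my own inventions:

```python
def aim_offsets(frame_w, frame_h, box, aim_below=0.0):
    """Turn a detected face box (x, y, w, h) into normalized pan/tilt
    errors in [-1, 1], suitable for driving two motors toward center.
    aim_below shifts the aim point down by that fraction of the frame
    height, e.g. to hit the body instead of the face."""
    x, y, w, h = box
    cx = x + w / 2.0                     # box center, pixels
    cy = y + h / 2.0 + aim_below * frame_h
    pan = (cx - frame_w / 2.0) / (frame_w / 2.0)   # +right, -left
    tilt = (cy - frame_h / 2.0) / (frame_h / 2.0)  # +down, -up
    return pan, tilt
```

Feeding these errors into even a plain proportional controller on each motor would walk the box toward the crosshairs; as the comment notes, you’d still have no range information without something extra.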

Yup. Make it a battlebot, aluminum and steel, so even the atypical angry moron that tries to kick it ends up with a bloody toe.

Secondly, make it shoot industrial earplugs.

Way, I did that as my last-year thesis project in robotics. Except it ran Linux (1.8 kernel, YEAH!) and used a pair of video cameras feeding a framegrabber board (switching between left and right to get a stereo image to process) to target.

P.S. It’s very easy to write software to lead the target as they run away, and if you spend more than $6.00 on your servos, you can aim faster than a human can run. This is a poor kid’s project, using a toy model tank… wow, n00b.
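Leading a runner is a standard intercept calculation: assume constant target velocity, solve for the time when the pellet’s travel distance equals the target’s distance, and aim at that point. A minimal 2-D sketch (ignores drag and gravity; all numbers hypothetical):

```python
import math

def lead_angle(target_pos, target_vel, pellet_speed):
    """Bearing (radians, from the shooter at the origin) to aim so a
    pellet fired now meets a target moving at constant velocity.
    Solves |target_pos + target_vel * t| = pellet_speed * t for t,
    which is a quadratic in t. Returns None if no intercept exists."""
    px, py = target_pos
    vx, vy = target_vel
    a = vx * vx + vy * vy - pellet_speed ** 2
    b = 2.0 * (px * vx + py * vy)
    c = px * px + py * py
    disc = b * b - 4.0 * a * c
    if disc < 0 or a == 0:
        return None  # target too fast to catch (or degenerate case)
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    if t <= 0:
        t = (-b + math.sqrt(disc)) / (2.0 * a)
    if t <= 0:
        return None
    ix, iy = px + vx * t, py + vy * t  # intercept point
    return math.atan2(iy, ix)
```

For a stationary target this collapses to aiming straight at it; for a target crossing your line of fire, the bearing shifts in the direction of travel, which is exactly the “lead” the commenter means.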

My biggest problem was the long recharge time and the 260-pound weight. After firing 20 earplugs, it had to run the compressor to recharge its air supply (it held 100 earplugs in the hopper), and the four car batteries gave it enough heft to make us laugh at any loser that tried to kick it.