Improving sports performance with a Kinect

As a recent Mech E grad, [Alessandro Timmi] knows a lot about moving bodies. His thesis, Virtual Sensei, aims to quantify those movements for better coaching and training in martial arts.

Virtual Sensei uses a Kinect for motion capture during training. From there, the skeleton recorded by the Kinect is run through a bit of processing to calculate the speed of the fists. Check out the demo video for a much better explanation of what Virtual Sensei can do.
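For anyone curious what "calculating the speed of the fists" amounts to: with skeleton data it's essentially finite differencing of the hand joint's position between frames. A minimal sketch, assuming a simple `(x, y, z)` joint format and the Kinect's 30 fps frame time (this is not Virtual Sensei's actual code):

```python
import math

def fist_speed(p_prev, p_curr, dt=1 / 30):
    """Estimate fist speed (m/s) from two successive 3D joint positions.

    p_prev, p_curr: (x, y, z) positions of the hand joint in metres,
    as a Kinect skeleton tracker might report them.
    dt: time between frames (the Kinect captures at 30 fps).
    """
    dx = [c - p for c, p in zip(p_curr, p_prev)]
    return math.sqrt(sum(d * d for d in dx)) / dt

# A fist that travels 10 cm between consecutive frames is moving at ~3 m/s:
print(fist_speed((0.0, 1.2, 2.0), (0.1, 1.2, 2.0)))
```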

Considering the number of sports that require precise alignment of the skeleton and timing of certain movements, we’re thinking this could be the breakout (non-video game) app to get the Kinect into the wild. Golf pros would love to record the swings of their students to make sure their shoulders are aligned.

Most of the Kinect hacks we’ve seen are robot builds, with a few 3D scanners and virtual wardrobes thrown into the mix. Virtual Sensei is a pretty impressive piece of software, and with a few additional sports it could make a killing.

10 thoughts on “Improving sports performance with a Kinect”

I wonder how limited this is by the 30fps limit on the Kinect’s webcam. Anyone with any decent martial arts training is going to be much, much faster than the guy in the video. At only 30 fps capture rate, I’d be very surprised if even moderately skilled martial artists wouldn’t be punching at a frequency above the Nyquist limit for this camera.
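The back-of-envelope numbers support this worry. A quick sketch, using rough jab figures that are assumptions rather than measurements:

```python
fps = 30
frame_dt = 1 / fps  # ~33 ms between samples

# Rough figures (assumptions, not measurements): a trained boxer's jab
# covers ~0.6 m in ~0.08 s, peaking well above its average speed.
jab_distance = 0.6   # metres
jab_duration = 0.08  # seconds

frames_per_jab = jab_duration * fps       # samples captured during the jab
avg_speed = jab_distance / jab_duration   # m/s, averaged over the motion

print(f"frames captured during the jab: {frames_per_jab:.1f}")
print(f"average speed: {avg_speed:.1f} m/s")
```

With only two or three samples across the whole motion, the velocity profile (and therefore the peak speed) is badly undersampled, exactly as the Nyquist argument predicts.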

Perhaps a bit of “form before speed”? You’re right, though, that the system will quickly reach a limit. But if it’s fine-grained enough for small-movement tracking, it might show that your form is all wrong when you slow yourself down. Once you get too fast, this could always be combined with things like accelerometer gloves or force plates to see what speed you reached once you blow past the camera.

Might be good for beginners or as a demo “Get into Martial Arts” booth.

I know what you mean. The classes I took always focused heavily on form, with the understanding that speed would come in time. At the very point when this software might be usable, most of my instructors would have been dead set against me paying any attention to my speed or kinetic energy. It may also provide a good playback of my form, but if you still haven’t mastered a move/form/kata then you aren’t really qualified to judge your own performance…

I suppose it could be used to record the motions being performed by an instructor for the lower level students to study though. At that point, you could even write software to compare a student’s performance to the instructor’s and provide a % accuracy comparison.

Initial reactions: recording looks pretty standard. It appears to use the algorithms in the OpenNI sample programs to capture and track the user’s skeleton, then saves that and does some basic physics calculations based on mass and height to get kinetic energy.
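That kinetic-energy step is presumably just KE = ½mv², with the moving mass estimated as a fraction of body mass. A sketch of how it might look — the 5% arm-mass fraction here is an assumption for illustration, not Virtual Sensei’s actual figure:

```python
def punch_kinetic_energy(body_mass_kg, fist_speed_ms, arm_fraction=0.05):
    """Estimate punch kinetic energy in joules via KE = 1/2 * m * v^2.

    arm_fraction is the share of body mass assumed to move with the
    fist (hand + forearm is roughly 2-3%, more if the upper arm and
    shoulder are counted); 5% is an assumption, not a measured value.
    """
    moving_mass = body_mass_kg * arm_fraction
    return 0.5 * moving_mass * fist_speed_ms ** 2

# A 75 kg person punching at 6 m/s:
print(f"{punch_kinetic_energy(75, 6):.1f} J")  # 67.5 J
```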

The tracking has the same problems that other Kinect programs sometimes have, most notably it can lose track of the user. A fairly simple forward kick caused my front leg to occlude enough of my body that it lost track of everything except my rear leg.

The loader seems to have trouble if the recording contains “multiple” users. (The occluding-leg problem, for example, caused “blue” me to leave the recording and be replaced by “green” me. The loader hangs every time I try to load a recording where something like this has happened.)

Colecoman’s concerns about not being able to capture fast movement turn out to be entirely justified: in the successful recording of myself doing a front kick, the stick figure’s leg barely twitches for a move which brings my entire right leg around 120 degrees and up to chest level, extends the leg, and then closes the knee. (Admittedly, hitting something with this would likely break my leg. I have no formal training, but that’s not what we’re testing here.)

The capture itself is marginally easier to set up than MikuMikuDance Studio, and probably easier to get “good” capture data out of. The interface could use some work, though, most notably condensing all of the separate windows into a single unified interface. Like with MikuMikuDance, I have to wonder if it would be possible to improve capture quality with a weighted average between two Kinect sensor streams, which is an area of research I’m planning on looking into once I’ve gotten sample programs of my own to compile and run.
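That two-sensor idea could start as simply as a confidence-weighted average of each joint’s position, assuming the two sensors have already been calibrated into a shared coordinate frame. A sketch of the commenter’s proposal, not tested software:

```python
def fuse_joint(p1, c1, p2, c2):
    """Fuse one joint position from two Kinect sensors by a
    confidence-weighted average.

    p1, p2: (x, y, z) positions in a shared, pre-calibrated frame.
    c1, c2: per-joint tracking confidences in [0, 1], as a skeleton
    tracker might report them (assumed interface, not a specific API).
    """
    total = c1 + c2
    if total == 0:
        return None  # both sensors lost the joint entirely
    w1, w2 = c1 / total, c2 / total
    return tuple(w1 * a + w2 * b for a, b in zip(p1, p2))

# Sensor 1 barely sees the joint (occluded); sensor 2 sees it clearly,
# so the fused estimate leans heavily toward sensor 2's reading:
print(fuse_joint((0.0, 1.0, 2.0), 0.2, (0.1, 1.0, 2.1), 0.8))
```

The hard parts in practice would be the extrinsic calibration between the two sensors and the IR interference two Kinects cause each other, not the averaging itself.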

Interesting to hear. It’s a shame I was right about the Kinect camera. One possible alternative might be to develop similarly functioning software using something like OpenCV rather than the Kinect. That way, you’d be able to make use of a much larger selection of PC cameras. I know that the Sony EyeToy camera can record up to 120 fps at 320×240 and 60 fps at 640×480.

Of course, with that much more data coming in (and without the dedicated processing hardware built into the Kinect) you’d need significantly more powerful computing hardware to perform the calculations. Then again, those calculations don’t need to happen in real time, so you could just record the video and let it do its thing afterwards.

The advantage of the Kinect is that it’s a depth camera, which is how it’s able to extract pose information. If you were using a PS Eye, you’d probably need at least two of them, calibrated for location (not difficult with QR markers), and the capture-ee to wear clothing that contrasts with the background. At that point, you’re rapidly losing the advantages over a standard mo-cap system.

If anybody without a Kinect, without the required space, or who just doesn’t want to record themselves dancing about like a ninny is interested, I can upload recordings to my personal site. They seem to average about the size of a small .jpg file.

This is really neat…if I still wanted to do martial arts I would love this…back when I actually wanted to do Karate, the instructors were all preachy, spending more time talking about religion than form…Karate was really popular with kids back then, all my friends started classes at about the same time…most of us left because we never did anything but sit on mats, the rest got pulled by their parents for religious reasons (it was like Shinto Sunday school). Not one of us even got the yellow belt with the green line.