Posted
by
timothy
on Saturday February 12, 2011 @03:38PM
from the but-that's-barely-a-kinection-at-all dept.

An anonymous reader writes with this snippet from Kinect Hacks: "Being able to send gestural data captured from your Kinect to another device via your computer over IR is incredible. You can send gesture recognition data to any piece of hardware that uses IR signals, such as your television, receiver, cable box or X10 extenders. Anything that reads IR signals can now be controlled simply by using gestures. Absolutely amazing. The developer wrote custom code that works with his Kinect sensor plugged into his Mac Mini. The code is integrated with OpenNI, which detects the user's skeleton and has specific gestures pre-programmed to turn his TV on and off and to change the volume on his digital receiver. Other gestures switch to the next and previous channel."
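The developer's code isn't public, but the pipeline described above (gesture recognized by OpenNI, mapped to an IR code, fired at the device) can be sketched roughly. This is a hypothetical illustration: the gesture labels, remote names ("samsung_tv", "av_receiver") and key codes are placeholders, and it assumes a LIRC setup where the stock `irsend` utility does the IR transmission.

```python
import subprocess

# Placeholder mapping from recognized gesture labels to (remote, key code)
# pairs as configured in a hypothetical LIRC setup. Neither the gesture
# names nor the remote configs come from the developer's actual code.
GESTURE_TO_IR = {
    "swipe_right":  ("samsung_tv", "KEY_CHANNELUP"),
    "swipe_left":   ("samsung_tv", "KEY_CHANNELDOWN"),
    "raise_hand":   ("samsung_tv", "KEY_POWER"),
    "push_forward": ("av_receiver", "KEY_VOLUMEUP"),
    "pull_back":    ("av_receiver", "KEY_VOLUMEDOWN"),
}

def ir_command(gesture):
    """Build the irsend argument list for a recognized gesture."""
    remote, key = GESTURE_TO_IR[gesture]
    return ["irsend", "SEND_ONCE", remote, key]

def send_gesture(gesture):
    """Fire the IR code via LIRC's irsend tool (needs a running lircd)."""
    subprocess.run(ir_command(gesture), check=True)
```

The skeleton-tracking side would feed `send_gesture()` whenever OpenNI's tracker reports a completed gesture; the mapping table is the only part that changes per living room.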

Where a device that costs $149 can be hacked to... let you channel surf even more easily. Then you can make a blog all about hacking a product and promote yourself to get more attention than a third world development charity. This is what hacking is all about.

The machine was rather difficult to operate. For years radios had been operated by means of pressing buttons and turning dials; then as the technology became more sophisticated the controls were made touch-sensitive--you merely had to brush the panels with your fingers; now all you had to do was wave your hand in the general direction of the components and hope. It saved a lot of muscular expenditure, of course, but meant that you had to sit infuriatingly still if you wanted to keep listening to the same program.

Sure you have to do a complicated dance routine to flip through channels and change the volume, but maybe, one day soon, we'll be able to operate basic television controls from across the room using only a single thumb!

Wake when my TV can figure out what I want to watch and puts it on before my ass hits the couch.

Actually, I wrote something similar to this two years ago, back when I was still watching regular TV (as in zapping through the channels).

Basically, pressing a button triggered a script running on an NSLU2 "slug", which fetched a list of everything currently on TV (an RSS feed from the internet) and then filtered it down based on some rules you specified.

First it tried to find TV shows like Simpsons, Family Guy, American Dad, etc.; if nothing was found, it filtered the running movies by starting time (so y
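The filtering logic the commenter describes (prefer favorite shows, otherwise fall back to sorting movies by start time) can be sketched in a few lines. This is a rough reconstruction under stated assumptions, not the original NSLU2 script: the favorites list, the data shape (title plus start time in minutes), and the tie-break rule are all guesses.

```python
# Assumed favorites list, standing in for the commenter's actual rules.
FAVORITES = ("Simpsons", "Family Guy", "American Dad")

def pick_program(listings, now):
    """Pick something to watch from current listings.

    listings: list of (title, start_minute) tuples for what is on now.
    Prefer a favorite show; otherwise take the movie that started most
    recently, so the least of it has already been missed.
    """
    favs = [p for p in listings if any(f in p[0] for f in FAVORITES)]
    if favs:
        return favs[0]
    started = [p for p in listings if p[1] <= now]
    return max(started, key=lambda p: p[1]) if started else None
```

Feeding this from a parsed TV-listings RSS feed and binding it to a physical button is essentially the whole gadget.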

If he's talking about computers: mod parent down.
Anyone who knows how to use them will tell you mouse gestures are far and away the most efficient way to browse the net. Especially if you use lots of tabs and sift through large numbers of pages.
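For what it's worth, the core trick behind browser mouse gestures is simple: reduce a stream of pointer deltas to a string of cardinal directions, then look that string up in a gesture table. A minimal sketch, with an illustrative threshold and gesture table (the actual extensions differ in details):

```python
def strokes(deltas, threshold=10):
    """Collapse (dx, dy) pointer deltas into a direction string like 'RD'."""
    dirs = []
    for dx, dy in deltas:
        if abs(dx) < threshold and abs(dy) < threshold:
            continue  # ignore jitter below the threshold
        if abs(dx) >= abs(dy):
            d = "R" if dx > 0 else "L"
        else:
            d = "D" if dy > 0 else "U"
        if not dirs or dirs[-1] != d:  # deduplicate repeated directions
            dirs.append(d)
    return "".join(dirs)

# Illustrative gesture table, loosely modeled on common browser bindings.
GESTURES = {"L": "back", "R": "forward", "DR": "close_tab"}
```

A right stroke followed by a downward stroke yields `"RD"`, which the table then resolves to an action (or ignores if unbound).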

Only once this kind of technology feels like using an iPad from a distance will it actually be useful. Until then this is no "game changer". That said, the combination of technologies that makes the present project work is impressive and all the best to the developer.

It is and will be a game changer for the disabled, and for environments where the user is not free to use their hands. Surgical and machinery operations spring to mind. Now that proof of concept is complete, the prototyping-to-production timeframe will be short. The interesting part will be providing a failsafe algorithm to ensure interpretation of the user's intent is accurate.

While Kinect looks really cool, I am impatiently awaiting the Asus Xtion, which is the same hardware as the Kinect except that it is not an Xbox accessory, and you do not have to give any money to Microsoft for it. The developer version, the Xtion Pro, should be out any time now according to their official schedule. There is also now an OpenNI API for communicating with Kinect-family devices, which is available for Linux as well. Hobby robotics vision has never seemed as promising as it does now.