Hitachi Gesture-Based Interface: Why Do We Hate Buttons So Much?

Seriously, will the future be button-less? What’s up with this surge of motion- and gesture-based UIs? Aside from Microsoft and Sony working on motion-based gaming controllers, Hitachi is currently developing a Minority Report-ish interface of its own. The company plans to use the technology for digital signage and – this I can understand – in the medical field, letting doctors manipulate data without actually touching the monitor.

Here’s the interface in action. It’s still in development, hence the lag in its response, but it does work:

What I can’t understand is why Hitachi plans to bring this to desktop PCs and even TVs by the middle of next year. No doubt, gesture-based technology is useful in some instances, but does it really have a place in everyday use? Will our lives be more awesome if we can wave our hands in front of our PCs just to view pictures and zoom in on maps? Have I just become too old to appreciate new technology? What the hell is going on?!