
siliconbits writes "Texas Instruments wants to deliver a Minority Report-like user interface by combining its just-announced OMAP 5 platform, which is based on two Cortex-A15 cores, with one of its own DLP pico projectors and a camera. The US semiconductor giant wants to pioneer the use of so-called next-generation natural user interfaces by adding hardware support for stereoscopic 3D, gesture recognition (including proximity sensing) and interactive projection. This is reminiscent of SixthSense, a wearable device invented by Pranav Mistry, then a PhD student in the MIT Media Lab's Fluid Interfaces Group, which he demoed at TED back in March 2009."

What's this obsession with Minority Report interfaces? They really don't look very ergonomic to me; I'd imagine all kinds of lawsuits if people had to use them on a regular basis. Just because something appeared in a (shitty) sci-fi film doesn't mean it's a great idea.

If you're going to make tech based on popular films, you should really start with the hoverboard and Mr. Fusion first. Once you've got those sorted, then you can make dumb-arse UIs.

My impression is that neural interfaces are currently in something of a bind: the effective "bandwidth" we can presently achieve without invasive surgery or big, pricey MRI setups is unexciting.

For the small number of patients so thoroughly paralyzed that even minimal-motion input mechanisms are impractical, they are still quite useful; but being stuck as a "medical device" relevant only to a small number of seriously impaired patients is not a recipe for high-speed development and

OK, I'll grant that the film wasn't great. That said, the core ideas were excellent. I don't think I have yet seen a single movie that lived up to a P.K. Dick short story, let alone a novel (possible exception for A Scanner Darkly, and they even messed that one up). Anyone know who has the film rights to "The Man in the High Castle"?

While I think that standing in front of a 'computer' and waving my hands looking retarded isn't the best way to interface with a machine, it would get folks off the couch and at least moving more than their fingers.

Just don't make it mobile. People look stupid enough with bluetooth head sets as it is.

Hands off my chair; when I'm ready to move around, I'll get up and move around.
Outside of it just being a novel interface, I honestly can't see any application where this type of UI is more practical or useful than what presently exists, but I can see several where it'll make general computing a lot more difficult. Just because the mouse has been around with us for a while doesn't mean that any conceivable alternative to it is an improvement. Think I'm going to have to pass on this, and hold out for the n

"While I think that standing in front of a 'computer' and waving my hands looking retarded isn't the best way to interface with a machine, it would get folks off the couch and at least moving more than their fingers. "

...which would probably be a net decrease in productivity, unfortunately. While I think getting people up and moving around more would be healthy, I don't know how efficient it would be to introduce large limb movements into the workflow.

Once you have the infrastructure for this, there's nothing stopping you from fine-tuning it to look for just your wrist movements.
You could even assign different actions depending on what you moved: your wrist for fine control of something, your entire arm for coarser actions, etc.
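The idea above (small motions for fine control, big motions for coarse control) could be sketched as a toy dispatcher. Everything here is hypothetical: the gesture names, the event format, and the 8 cm threshold are made-up stand-ins for whatever a real recognizer would report.

```python
# Toy sketch: map gesture scope (wrist vs. whole arm) to different actions.
# The recognizer interface and thresholds below are invented for illustration.

def classify_scope(displacement_cm: float) -> str:
    """Small motions read as wrist gestures, large ones as arm gestures."""
    return "wrist" if displacement_cm < 8.0 else "arm"

# The same gesture shape triggers a different action depending on its scope.
ACTIONS = {
    ("wrist", "swipe_left"): "previous_page",   # fine control
    ("arm",   "swipe_left"): "switch_desktop",  # coarse control
}

def dispatch(gesture: str, displacement_cm: float) -> str:
    scope = classify_scope(displacement_cm)
    return ACTIONS.get((scope, gesture), "ignore")

print(dispatch("swipe_left", 3.0))   # wrist-scale motion
print(dispatch("swipe_left", 40.0))  # arm-scale motion
```

A real system would classify on joint trajectories rather than a single displacement number, but the dispatch-on-scope structure would look much the same.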

For ages I've been wondering why processor power has increased many times over while input devices have hardly changed at all.

We're starting to see that change with things like the Kinect and multi-touch on smartphones. Now this news looks like it could be the next step: holographic input. In some ways that conceptually combines the Kinect model with the multi-touch model in 3D space.

It is gratifying to me to see input devices finally start to take some lurching steps forward. It is still unsteady to

But after using Android for a while, I can certainly see where different input methods have value. Also, after seeing what some hackers are doing with the Kinect, it really opens the mind up a little in terms of what is possible.

I'm not suggesting the death of the keyboard-and-mouse hegemony, but maybe there is room for some other effective input methods as well...

If someone really came up with better input devices, the pro computer gamers would be using them.
As for user interfaces, StarCraft players seem to manage a great many actions per second.
So the challenge is to create a user interface that's friendly and usable for "noobs" but also able to push pros and experts to their limits.
Most recent UIs seem to emphasize being friendly and usable for new users, but they neglect the case where some of those new users don't mind taking the trouble to learn to do things very much faster.

Current games are designed around keyboard/mouse input; the Kinect stuff is programmed to take advantage of that hardware. Give it time.

I've been too comfortable, sitting in a big desk chair all day, hardly moving a mouse a few inches to summon an entire world of information. What I really want is to be standing up all day in front of a huge transparent screen, holding my hands up in the air and wildly waving my arms around. That would be a huge improvement.

Yeah, but just think of the benefits: if you walked down the street wildly waving your arms around ten years ago people thought you were a lunatic, if you do it ten years from now people will think you're a poser with the latest Apple iThing.

I still get this with Bluetooth headsets. See someone walking alone down the street, arguing with themselves for a few minutes. That little blue LED really needs to stay on during use instead of blinking, so we can more quickly identify the real schizophrenics.

I'm not sure I get this - do all schizophrenics have blinking blue lights, or is it just you?

>>>I've been too comfortable, sitting in a big desk chair all day, hardly moving a mouse a few inches

Precisely. My arms already hurt just from imagining waving them around all day in my office. I prefer the current interface, where I just lay my arm on the table and barely move at all except to adjust the mouse and click. Non-labor, rather than wearing myself out like a factory worker doing repetitive motions.

Minority Report was NOT a camera-based gesture-recognition interface! There was a control glove! Two of them! Every single time someone does this, it's always a gesture-based camera system, and they completely forget the hand-pieces!

Quit that shit already and go back and see it. Here, better yet, the video of the exact scene: Minority Report viewscreen [youtube.com]. Someone e-mail EVERYONE, EVER.

Gesture-based recognition systems are terrible, they really are. They have absolutely no precision at all. I wish people would get away

Sure, sure, not enough foresight, but it still feels like a solution in search of a problem. Anything that impairs a person from using a computer will probably involve an expensive enough procedure that they would have assistants anyway. Except perhaps for a car mechanic, and even then, what would the benefit be over, say, voice recognition, or designing rugged, waterproof, washable interface hardware?

First we got Bluetooth headsets which made people talking on the phone indistinguishable from schizophrenics, now we get this new GUI technology which makes computer users indistinguishable from spastics... what next?

Yeah, we know, for all the data entry, heavy computer use/coding jobs an interface like the Minority Report one is stupid and inefficient. But at least it's a step in the right direction. For those of you that have been posting about how nice it is to sit in an office chair and click a mouse all day, I have to ask what kind of ecstasy you're popping to make that a pleasant experience.

I fucking hate the typical cubicle set up. It makes my lower back hurt after 5 hours. My shoulders constantly ache from being crouched over a tiny POS keyboard. My right hand is significantly fatigued by the end of the day, whereas my left elbow is killing me from leaning my weight on it constantly. Sitting on my ass all day puts enough pressure on my prostate (okay, I have a bony ass) that I wonder if it might have negative effects on the libido. Squinting at monitors that blast my face with bright white backgrounds all day makes me hate the color white.

When it comes down to it, the typical cubie setup is a completely shit arrangement to be stuck in for 8 hours on end. So to fix it, we have to get up every hour and stretch or walk about for 5 minutes. We pay for massages and chiropractic adjustments to fix our strained backs. And don't even get me started on the amount of tension we hold in our jaws. What TI is proposing here is an interface that could let you stand up for a period, then sit down for a period. It could use an 'on-screen' split keyboard for when you need to type in info. It would be capable of projecting onto a cubicle wall, so I wouldn't be tethered to a monitor on a desktop. It will be clunky and a PITA for the first generation or two, but give it some time and some prototypes and we could really be working our way toward a useful engineering interface that frees us from the ergonomic hell that is the cubicle lifestyle. Personally, I'd like to see more research go into this field.

I know a lot of engineers who wet themselves when they watched the first Iron Man, where Tony Stark designs an exoskeleton using a 3-D holographic touch interface. That was damn cool and you know it. If projects like the one TI is proposing get us closer to that kind of modeling interface little by little, I'm all for it. Bring it on, TI!

(Long story about poor management as the cause of poor ergonomics in the workspace)... What TI is proposing, here, is an interface...

(Long list of interesting technical possibilities)

Bring it on TI!

The problem is you (probably) don't work for TI; you work for the guys who simply don't care if they destroy your body. And a new technological way to do it is just going to be a new technological way to destroy your body.

Your management will simply force you to wear extra-heavy boxing gloves on your upraised hands. Maybe they'll tie your work metrics to how many times you lift a weight using your back instead of your legs (yes, I am well aware that is "wrong"). Maybe, to make it sense you better, your boss will

OMAP 5 is scheduled for release in the second half of 2011. Has anyone seen one of these in action? Not the Minority Report stuff, just dual A15 cores running anything?

And is this the first product to reach even early production with an A15 core or cores? Is there any A15 presence around other than this? I see Nvidia is planning something, but it's still vapour, right?