
jfruh writes "More than a decade ago, the special effects artists working on the Steven Spielberg film Minority Report synthesized experimental thinking about GUIs to produce a floating interface that Tom Cruise manipulated with his hands. In 2013, surrounded by iOS and Android and Windows 8 devices, we use stripped-down versions of this interface every day — and commercial artist Christian Brown thinks that's a bad thing. Such devices may look cinematic, he argues, but they completely ignore the kinds of haptic and textured feedback that have defined how we interact with devices for centuries."
Speaking of Minority Report interfaces — a new armband sensor using a gesture-based control scheme is the latest gadget to invoke references to the movie.

At the outset, Beta had slightly higher video resolution than VHS. VHS had 2-hour tapes rather than Beta's 1-hour tapes.

How do you say which was "better", objectively? The ability to record a movie while you're out of the house (impossible with a 1-hour Betamax tape) is a huge deal. Not having to turn on the lights and switch tapes halfway through a horror movie (and ruin the mood) isn't nothing. Having the video store's inventory take up half the room is a big deal.

By the time Beta II speed finally allowed 2-hour tapes, it was competing with VHS HQ. At that point the video quality difference (which was always pretty small to begin with) between VHS and Beta was negligible and depended more on the quality of the player and tape than the format. Meanwhile VHS had added 4- and 6-hour modes.

And by 1984, Betamax VCRs were selling for about half the price of VHS players and still couldn't get any traction.

While LCD monitor makers are striving to improve contrast ratios and reduce glare (blacker blacks, broader viewing angles and deeper, more vivid colours), futurists envision a world of high-glare, transparent monitors where ambient lighting and objects on both sides of the glass wash out contrast and colours. Absurd.

How often do you see a cell phone, tablet, or even laptop with a matte screen? They're almost all high-glare nightmares. The makers have ignored the best way of reducing glare because a shiny screen looks better, and therefore sells better, right up until the point where you try to actually use the thing. The only way around it is to crank up the brightness to try to overcome the glare. That kills battery life, but it's worth it for a shiny screen when it's off, right?

I came here to reply to the first truncated line of your comment... but then I was greeted with the second line that said exactly what I intended to say (that it's marketing designed to sell until the honeymoon is over).

I don't buy that "crispness" argument for a second. I have 3 matte screens in front of me at home, and the picture is plenty crisp. I can also see what I'm doing even with the lights on. There is absolutely no excuse for shiny screens. I've never talked to anybody who prefers them, but marketing departments obviously do...

I have a shiny CRT screen (glass) and a matte projector screen, both HD, both linked up to the same sources, and I find the projector version much clearer in almost all scenarios. The reflective glass simply is not helpful no matter what they say.

Or we could save the money on that product which is bound to be a huge cost sink and just use existing matte technologies....

In fact such a technology exists on almost all modern TVs: they have a "store mode" and a "home mode". The difference is that the store mode runs at max brightness at all times so as to wash out the glare. Often the "home mode" isn't even capable of reaching the same brightness, because the set would never get Energy Star certification if it did. (Have you ever wondered why electronics ask you, when you first set them up, whether you are a store or not?)

Replacing buttons with simple text on them with cute little icons drives me absolutely insane. It's almost as if the goal were to make life harder for the people who have to instruct others on how to use these products.

"Click the send button" becomes "OK, do you see the little box with a picture of an envelope in it, with some lines next to it... in the upper left corner of the screen? You don't? Keep looking."

Which is why I am stunned that Cadillac is using this in a car. In fact, they are bragging that this is better than buttons. Because what we need in our cars is more shit that takes our eyes off the road.

Which is why I am stunned that Cadillac is using this in a car. In fact, they are bragging that this is better than buttons. Because what we need in our cars is more shit that takes our eyes off the road.

We know how this will end. Someone will get killed because a Cadillac driver was trying to do something that REQUIRES his eyes to be looking at the dash. A sharp attorney will realize this is a design flaw. They will find email and disgruntled ex-employees that will show this was known in advance, and... well, you know the rest.

I don't see how this would happen. Cadillac's not the first automaker to jump on the touchscreen-for-everything paradigm; BMW and Ford have been doing it for years. If they haven't gotten sued yet, then I don't see why Cadillac would get in trouble for it.

Which is why I am stunned that Cadillac is using this in a car. In fact, they are bragging that this is better than buttons. Because what we need in our cars is more shit that takes our eyes off the road.

Luckily, Cadillac's target market of "old guys, soccer moms in mommy tanks, and cognac-swilling rappers" is known for its superb reflexes and incredibly responsible driving. Nothing bad could possibly happen.

Actually, it isn't. It's easy to translate a word. An hour of a translator's time will easily do an entire UI for a moderately complex application. It's much harder to localise icons. For example, in China a red envelope is a good thing to receive, whereas in the UK it's a final demand for a bill. An owl signifies intelligence in most of Europe, evil in parts of Latin America, and stupidity in much of Asia, yet it used to be a very common icon for help screens.
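The comment's point can be sketched in a few lines: translating UI strings is a plain table lookup, while an icon's *meaning* shifts per culture and has to be re-chosen, not translated. The tables and names below are purely illustrative, not taken from any real i18n library.

```python
# Text localisation: a simple key -> string lookup per locale.
UI_STRINGS = {
    "en": {"send": "Send", "help": "Help"},
    "de": {"send": "Senden", "help": "Hilfe"},
}

def translate(key: str, locale: str) -> str:
    """Translating a word is trivial: one dictionary lookup."""
    return UI_STRINGS[locale][key]

# Icon "localisation": the same glyph carries different connotations,
# so there is no lookup that preserves meaning across locales.
# (Connotations below paraphrase the comment's own examples.)
ICON_CONNOTATION = {
    ("red_envelope", "zh-CN"): "gift of money (positive)",
    ("red_envelope", "en-GB"): "final demand for a bill (negative)",
    ("owl", "europe"): "intelligence",
    ("owl", "asia"): "stupidity",
}

print(translate("send", "de"))           # Senden
print(ICON_CONNOTATION[("owl", "asia")])  # stupidity
```

The asymmetry is the whole argument: the string table scales linearly with locales, while the icon table forces a per-locale design decision for every glyph.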

For example, in China a red envelope is a good thing to receive, whereas in the UK it's a final demand for a bill. An owl signifies intelligence in most of Europe, evil in parts of Latin America, and stupidity in much of Asia

Yet another reason why there should be one world language (UK English) and one world culture (UK/English).

Agreed completely. I get it when "Settings" gets replaced with a gear icon. That's pretty standard now, and "Settings" is pretty long to put on a button, but on a touch interface I don't get to hover over icons to see what they're called like with a mouse, so a good description or help interface needs to exist.
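The missing-hover problem has a straightforward mitigation: every icon-only control carries an explicit text label that a long-press (the touch analogue of hover) or a screen reader can surface. A minimal sketch, with hypothetical names throughout:

```python
from dataclasses import dataclass

@dataclass
class IconButton:
    """An icon-only touch control that still carries its own text.

    Hypothetical model, not any real toolkit's API: the point is that
    the label/description travel with the icon so the UI can expose
    them without hover."""
    icon: str          # glyph name, e.g. "gear"
    label: str         # short human-readable name
    description: str   # fuller help text

    def long_press_help(self) -> str:
        # What a long-press or accessibility service might announce.
        return f"{self.label}: {self.description}"

settings = IconButton("gear", "Settings", "Open application preferences")
print(settings.long_press_help())  # Settings: Open application preferences
```

Real toolkits do ship this idea under other names (e.g. accessibility labels or content descriptions on icon buttons); the sketch just shows why the text must exist even when it isn't painted on the button.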

I must admit I recently started looking a lot into http://en.wikipedia.org/wiki/Common_Desktop_Environment [wikipedia.org]. I was super happy with Gnome 2; my productivity was never higher! Then, *bam*, Gnome 3 shoved down my throat. I actually tried to use it for a month, but it was too painful: it was slow as hell, crashing all the time. Now I'm with KDE 4. It is not as fast as Gnome 2, but feature-wise it is in an entirely different league. Still, I feel I don't use most of its features.

The other day I needed a fancy way to visualize data in a gdb session - that's when I found ddd. The Data Display Debugger http://www.gnu.org/software/ddd/ [gnu.org] is written in http://en.wikipedia.org/wiki/Motif_(widget_toolkit) [wikipedia.org] . I was amazed how responsive and fast the GUI was. I found the GUI very well organized and not confusing at all to use. So I wonder, why are we really moving away from this? Why is everything turning into eye candy bloatware?

Well... is that a bad thing? I'm not a user interface expert, but in my opinion it provides better contrast between interface elements/controls. http://ferret.pmel.noaa.gov/static/Documentation/rostock_paper/gui_main.gif [noaa.gov] Can't you tell immediately what the controls are and what they do? Now compare that with the interfaces some software companies started shoving down our throats, where you can't even tell the difference between clickable and non-clickable elements! That, and the abusive integration of media

They are movies, with fictional stories! The holographic or transparent screens allow shots of the actor's face. The crazy gestures are there so the actors can be emotionally expressive for the viewers.

If during the '80s we had made a movie about 2013 that got it right, it would seem comical: teenagers getting bullied over a tiny glowing box, and everyone just crouched over, tapping the little box. There is no emotion for the movies.

That is why they showed 2013 with big-screen TVs and video conferencing: it made the antagonist seem larger than life and gave the actor someone to react to.

I never understood why anyone thought that the computer in Minority Report was something worth pursuing. Futuristic computers in Hollywood movies have always been designed to look cinematic with no regard for how they would actually function. Having an intuitive interface isn't important for Hollywood directors, having something that is interesting for the audience and makes it obvious what's going on is.

One common example of this is maps. 3D maps are all the rage in Hollywood movies, even when a simple address would suffice. But an address has no cinematic quality, a 3D map does.

Oh yeah, and Keanu Reeves (Johnny Mnemonic) did it way before Tom Cruise (Minority Report).

Minority Report was the Philip K. Dick-inspired one; Johnny Mnemonic came from a William Gibson story. Though I do agree that JM did aspire to the abstract interface pattern better than MR. Personally I was reminded of Snow Crash when I saw JM (with Hiro P and the avenues of cyberspace).

Hollywood is made of shiny visuals. And, of course, designers love good-looking form to the point that function can get skimped on. Redmond has been doing its level best to serve up its version of MovieOS, down to the security problems.

GUIs have increased productivity. How? Simply because instead of paying one UNIX guru to do it in 20 years, you can have 20 people do it in one.

That's a very bold claim! GUIs are only very good at one thing: suggesting context to the user. That helps when the user does not have a very good idea of what he wants to accomplish or how to accomplish it; think of browsing the internet. CLIs are very powerful and orders of magnitude faster than GUIs when you have a clear idea of what you want to accomplish and at least a slight idea of how to accomplish it. This is the case when you are programming, and it is the case when you are administering systems.
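The "orders of magnitude faster when the task is precisely known" claim is easy to make concrete. Renaming every report in a tree is hundreds of clicks in a GUI file manager, but a few scripted lines. A minimal sketch (the file names and the .txt-to-.bak convention are illustrative):

```python
import pathlib
import tempfile

def backup_rename(root: pathlib.Path) -> int:
    """Rename every .txt file under root to .bak; return how many."""
    count = 0
    # Materialize the match list first so renames don't disturb the walk.
    for p in list(root.rglob("*.txt")):
        p.rename(p.with_suffix(".bak"))
        count += 1
    return count

# Demo against a throwaway directory with three fake reports.
with tempfile.TemporaryDirectory() as d:
    root = pathlib.Path(d)
    for i in range(3):
        (root / f"report{i}.txt").write_text("data")
    print(backup_rename(root))  # 3
```

The same operation through a GUI is one drag-or-rename per file, with no way to express "all of them, recursively" at once; that asymmetry is exactly the parent's point.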

I never understood why anyone thought that the computer in Minority Report was something worth pursuing.

Right now I wish I had an interface like that. I am working with three screens on two different computers side by side and delving into 6 or 7 large documents and programs from disparate sources that I need to troll through in order to make sense of the partial information in each one. If I could have a single virtual desktop that was the size of my real desk and be able to push documents around like in Minority Report, then I would be a happy camper. As it is, I feel like I am peering through a keyhole at

Yes. No real world crook would take the time to write a virus with a sexy female voice that says "Releasing deadly virus in... FIVE seconds...". Hollywood had it wrong all the way back from the time they decided that there is some Terrorist Bombers' Guild that has standardized the color coding of bomb wiring. If I were a bomber, I'd use purple wires for everything. Try disarming *that*.

I've never fully understood the need to have little lights on every damn thing. My DSL modem has 5 lights on all the time, some of which blink. My battery backup has a light on it, my speakers have a light on them, my monitor has a light in the power button, my Xbox power brick has a light that you might be able to see from orbit, the power strip in the living room has a light in it, and my laptop has a light that 'breathes' when the computer is asleep. 90% of these devices are pretty damn obvious if they are 'on' or 'off' without a glowing indicator. Hell, I had a router once that had a little blue dome on the top, and inside were about 18 LEDs that flashed in some sort of relation to the WiFi activity. It looked like an epileptic seizure at a disco. Later versions of that router came with a button to turn the LED dome off, but the first model just had a little plastic shield you were supposed to clip in place if it was annoying.

Long story short, Foil Tape with tiny pinholes in it has become my friend. I simply cut a bit of tape, poke a tiny needle-hole in it, and affix it over the LED. I can still see the indicator if I need/desire to, but am not barraged with little flickering lights in my bedroom at night.

Yeah, most of the time in the movies they're just trying to kill everyone. So it's no big loss to me that they don't work like that.
Though I admit I would enjoy using Siri more if she were more like GlaDOS. She should at least have a sense of humor if she's going to give me the wrong information.

The biggest problem as I see it is that you can't feel the controls. Like all the interfaces in ST:TNG, there is too much dependence on having to look where your hands are. I think that's a distraction at a very basic level that we haven't fully noticed yet, let alone dealt with in any meaningful way.

Think of your old-school cell phone. You could make a call, even text, without looking at it. (Or, I could. Your mileage may vary, I guess.) Can you do that with your glass-smooth smartphone now?

And yeah, I know. "Siri, Call Police!" "Calling Portobello. When would you like reservations?"

As I see it, the big difference between physical controls and colors and text on a touchscreen is that you can manipulate physical controls while looking elsewhere. There are times when that may be kinda important.

Exactly why I did NOT get a Samsung Galaxy S3 and went for a Samsung Galaxy S Relay instead. Unfortunately, I can't seem to find the Phone key (though for some reason there is an e-mail key, a text message key, function-.=.com, and a voice mode key, all squeezed into the thumbboard). I also have yet to learn to touch type numbers on it.

I gave up. I really didn't want a phone without a physical keyboard, but at the same time, I did want a modern phone, and the manufacturers refuse to sell anything where I am that qualifies as both. The only phones I can find with physical keyboards are a minimum of about 3 generations behind the current phones. So I compromised and gave up on a physical keyboard. Unfortunately I'm now "proof" to these idiot companies that people "want" phones without keyboards, when in fact I'm the opposite; there just wasn't another option.

Before I got my iPhone, I'd have agreed with you. I seriously thought of ditching my Android for an old-school phone with a real number pad. But with contacts and a touch screen that actually work, I hardly ever key in a number now. Full disclaimer: I've never been much of a texter, so I can't really compare the interfaces in that context.

Like all the interfaces in ST:TNG, there is too much dependence on having to look where your hands are.

There are some things TNG predicted well, but a few glaringly funny missteps in retrospect. My two favorites are:

1) Piles of PADDs. There are a few scenes where someone is "doing a lot of reading" or "has a lot of reports to file" and so they have a bunch of PADDs strewn about their desk. Little did I know I needed a separate Kindle for each ebook I read.

There are some things TNG predicted well, but a few glaringly funny missteps in retrospect. My two favorites are:

1) Piles of PADDs. There are a few scenes where someone is "doing a lot of reading" or "has a lot of reports to file" and so they have a bunch of PADDs strewn about their desk. Little did I know I needed a separate Kindle for each ebook I read.

Lots of the time, they are cross-referencing things in parallel, which is inconvenient on a single screen of that size. With replicators, PADDs are presumably literally as cheap as dirt, rather than luxury gadgets, so there's no real reason not to have one for each document when you need to do that.

This may still come to pass. I have a massive monitor at the office, but I've found that using hard copies of specs improves my productivity -- I think it's because it gives me a feel for "where" the piece of information I want to access is. I have to turn my head or move my hand to a separate, physical location in space rather than doing a virtual switch on screen.

If e-readers were to become cheaper and thinner, I'd have a bunch of them on my desk too.

The biggest problem as I see it is that you can't feel the controls. Like all the interfaces in ST:TNG, there is too much dependence on having to look where your hands are. I think that's a distraction at a very basic level that we haven't fully noticed yet, let alone dealt with in any meaningful way.

Think of your old-school cell phone. You could make a call, even text, without looking at it. (Or, I could. Your mileage may vary, I guess.) Can you do that with your glass-smooth smartphone now?

Unfortunately, physical buttons are expensive, especially on a device that really needs a touch screen for some things anyway. I clung to my slide-out QWERTY keyboard for as long as I could, but eventually had to get a touchscreen because that's all the manufacturers want to make. The good news is that it's not a problem people don't know about, and in fact several companies have come up with various technologies to try to make a touchscreen tactile. I saw one idea that was basically inflatable bubbles under the surface of the screen that could inflate buttons as needed; I believe it was BlackBerry who a while ago made their whole screen push in like a button when you clicked on it; and of course almost every phone these days has haptic feedback (which I usually turn off as soon as I can). Unfortunately none of these have worked well yet, but give it some time and we may get there.

I do find it interesting that you mention ST:TNG, from what I understand the theory behind their LCARS "touchscreens" was that it actually was tactile, just using a technology that we don't yet have (and that obviously wasn't so visible on screen) with the idea that you could actually have the best of both worlds. A shared console that each user could easily re-arrange for their particular preference, or current task, while still retaining the feel of real buttons. At the moment the idea sounds really appealing, but it's a ways off in implementation yet.

Star Trek is actually a great illustration of this: there were times in the original series where the actors had their hands on the controls but their attention focused on the action for dramatic effect. They didn't need to constantly look down as in The Next Generation.

Exactly. In the old series, the controls may have been in weird shapes and not labeled unless the audience needed them to be, but they were physical controls, and the odd shapes could actually help the operator manipulate them by feel. All that is lost in modern-looking interfaces.

Star Trek is actually a great illustration of this: there were times in the original series where the actors had their hands on the controls but their attention focused on the action for dramatic effect. They didn't need to constantly look down as in The Next Generation.

They didn't actually constantly look down in Next Generation, nor did they need to (either in reality or in-fiction). In reality, because the controls weren't on the props but were digitally added in post-production; and in-fiction because they used fancy force-field tactile feedback.

Not only this, but the ST:TNG UIs had tactile feedback, just like mechanical buttons. They did it with miniaturized force fields or somesuch; it's in the TNG Technical Manual. Obviously, force fields aren't real (yet), just like warp drives and artificial gravity, but that's the official explanation which acknowledges that tactile feedback is desirable in a UI. This tech manual came out around 1991-1992, long before this whole touchscreen tactile-feedback-less craze got started.

Well, sticks and rocks, which can be used as devices. Simple machines (at least since the Grecian days). Also books (primitive information transmission devices, soon to be rendered obsolete). Clocks, microscopes, telescopes... And the list goes on. Many "devices" have been known for quite a while, even if they don't (necessarily) have digital interfaces...

...between reality emulating film and reality converging on film. The former is something that should generally be avoided when it comes to cinematic user interfaces, given that most of them are designed for cinematic effect rather than usability. On the flip side, there's nothing wrong with the latter taking place if it just so happens that better usability corresponds to something that's shown up in films (or books, or any other form of media) already. We see this sort of thing happen on a regular basis

I remember the local performing arts center getting new stage managers' consoles. The stupid thing was that the cue buttons were on a touch screen, so there was no non-visual feedback as to whether a button had been pressed or not. A stage manager has to keep their focus on the stage. They went back to the old push-button system. This is just one example where the lack of kinaesthetic feedback makes touch screens a bad UI choice. There are many more: wherever one needs to operate a control without looking directly at that control, touch screens are a bad choice.

But that is exactly the point. Watching someone move a mouse around is boring; if you want it to be interesting in a movie you need to exaggerate the gestures. Little things don't show well on screen, so they get made big. For film this is good. The problem isn't that these are bad movie interfaces; they're actually very good for movies. The problem only comes when someone watches the movie and then decides they should cripple the rest of the world with the same interface because it looks neat.

Is it any surprise that Hollywood gets UI wrong in favor of "looking good" when we have:
* Bad physics (don't even get me started on the sound explosions make in space)
* Bad understanding of current technology (every hacking movie ever, with the very notable exception of The Social Network)
* Bad history (based on a true story!)
Etc., etc.

Hollywood fundamentally wants to make something that "looks pretty" and to hell with practical applications -- because that pretty picture is ultimately what is being delivered.

What was it that won the Oscar again? A smooth operation was not cinematic enough, so history was thrown out the window. I dread to think what they did with chunks of Zelazny's "Lord of Light", which was the cover script in reality and got renamed "Argo" for some reason.

Minority Report's interface was not "terrible." It was really good, and so are most interfaces seen in movies.

Well, they're really good for doing what they're supposed to do.

What's the purpose of an interface? To provide a means to make what you want to do understood, and to provide feedback on the results of your actions or requests, and both of these things should be clean and unambiguous.

In a real-life interface, when you're trying to "ACCESS FILES" you move a tiny cursor with small hand gestures and then double click on a "Documents" folder that's next to a bunch of other folders, all labeled with small text fonts. Then you look past a bunch of unrelated files to find the one you might be looking for. Or type "ls" in a command line and a bunch of filenames scroll by. And if you need to enter a name and password, a small box appears for you, and when you get the password right, the box just disappears with no other information, or you get a small red line of text that says "wrong username or password."

This is effective for IRL computer systems, as it makes it easy for the user to unambiguously communicate what they're trying to do, and the results are obvious. In a movie, this is terrible. The director has a three-second cut to the screen where the hero is trying to ACCESS SECRET FILES before the rogue agent comes back into his office. And you can hear his footsteps coming down the hall! And a cut to the door handle turning! A cut to the hero! And a cut to the screen! And in those brief cuts, you need to unambiguously tell the audience what's going on with the computer. "ACCESS SECRET FILES: ENTER PASSWORD." "ACCESS DENIED." "ENTER PASSWORD." "ACCESS GRANTED!" "COPYING SECRET FILES 15%... 30%..." Oh, and bonus points if the hero's face is reflected in the screen, because then the audience can see not only that he's trying to ACCESS SECRET FILES but also his intense expression, to build tension in a scene that's basically about pressing buttons on a computer.

So the interface in Minority Report was great. Cruise was doing something really boring: looking up files on a computer. Spielberg could have just plopped him down in front of Windows 2054 (it's a redress of Windows ME) and had him click on some icons, but instead we get to see exactly what he's doing with big, obvious gestures. "Looking at several videos! Picking these! Rejecting these! Zooming in on these! Marking that!" And all the while you got to see his face through the transparent glass screen. Cruise's actions are clear and unambiguous and his goal and the results are communicated well to the audience. That's a great "interface" between the director and the viewers.

Just saying, you don't pay Tom Cruise $20 million and then spend 2 minutes of your movie showing a mouse clicking around a screen.

"In 2013, surrounded by iOS and Android and Windows 8 devices, we use stripped down versions of this interface every day"

No, we don't. iOS and the rest were never based on anything from Minority Report. The problem with a Minority Report-type floating interface is that your arms very quickly get fatigued. See any early 3D file system viewer.

Samsung's Alias 2 was a great phone. It had a twist-hinge in the middle that could be opened either like a clamshell phone (vertical, for talking) or horizontally as a text keyboard. And the buttons were e-ink, so they showed different labels depending on how you opened it. Since it was a clamshell, it couldn't butt-dial, and its keyboard was tactile.

But it was not "smart". It used BREW, Qualcomm's dumb-phone software. The screen wasn't touch-enabled and was small for a smartphone. Still, for someone doing more phoning than surfing, it would be better. Too bad they discontinued the line rather than do an Android version.

In the meantime I hang on to my old Sammy clamshell, since I use the phone for, uh, phoning and use my computers for email and the web. This way I use appropriate keyboards and no touch screen. Touch screens require good hand-eye coordination, and as a fast touch-typist, I don't look at the keyboard; I feel the keys. Touch screens are just useless to me.

And as others noted above, touch screens in cars should be outlawed as an imminent hazard.