Just attended an IDGA presentation on next-gen holographic display technology. Holy crap! Based on what I just saw, we could conceivably have Star Trek "holodeck" type displays within a decade or two. The basic tech already exists as a proof-of-concept prototype... it just needs to be scaled up by a few orders of magnitude. (Not unexpectedly, there's a mind-bogglingly huge amount of bandwidth and compute power involved.)

I had not been keeping up with recent developments in holography. This was a real eye-opener.

The years just pass like trains. I wave, but they don't slow down. -- Steven Wilson

I kind of remember talk about that stuff from my school days, but never saw anything demonstrating plausible ways for it to work. I suppose I'll have to waste time (like I do all too often on this site) looking around for info about it.

The one they talked about today didn't have a spinning mirror up in the area where the image is displayed. It projects the image into the empty space above a pedestal containing the rendering hardware and optics; the image appears to float in mid-air. Input is a normal OpenGL display list which gets rendered into a hologram in real-time by a custom supercomputer built into the pedestal; multiple pedestals can be pushed together (tiled) to get a larger display area. They didn't have the actual unit here for us to look at (it is still just a lab prototype at this point), but claimed that scaling it up to wall size is "just a matter of money".

They did have some pretty impressive static holograms on display at the talk. I was not aware that you could have large (I guesstimate the usable virtual image volume to be about a cubic meter) full-color holograms which are illuminated with nothing more than a single overhead halogen bulb. (I thought you still needed lasers to do stuff like that...)

What is the light projected off of? Do you have any links or information about the actual mechanics or physics involved in these displays?

Yeah, I'm curious how they figured out how to "freeze" light in mid-air. Besides the spinning-mirror one, I've seen other attempts using things like mist or dust, none of which seems particularly promising to me.

The guy giving the presentation wouldn't say what the underlying mechanism in the real-time display was. I can give a little more detail on the static ones though (since he was willing to talk about those and there were several examples on display), and can speculate on what they might be doing in the real-time one.

The static ones weren't projected "off of" anything. They looked like sheets of flat plastic a few mm thick, which were (as I noted before) lit from above with a normal halogen incandescent bulb. The illusion of objects floating in the air above it (and in the floor below it as well) was rather eerie. This was a full 360-degree view; you could walk around it or spin it (it was mounted on a turntable), to see it from all sides. One of the examples showed two different things depending on which hemisphere it was viewed from.

There is a viewing angle limitation -- your eyes need to be somewhere in the 90 degree cone coming out of the center of the "image"; if you look at it from an oblique angle (more than 45 degrees off-axis), it just looks like a sheet of flat black plastic.
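That 45-degrees-off-axis condition is easy to sanity-check. Here's a minimal sketch (the coordinate system and axis choice are my own assumptions, not from the talk) that tests whether a viewer's eye position falls inside the 90-degree viewing cone:

```python
import math

def visible(viewer, center=(0.0, 0.0, 0.0), half_angle_deg=45.0):
    """Rough check: is the viewer inside the 90-degree viewing cone,
    i.e. within a 45-degree half-angle of the axis rising vertically
    from the center of the sheet?  Illustrative geometry only."""
    dx = viewer[0] - center[0]
    dy = viewer[1] - center[1]
    dz = viewer[2] - center[2]  # z = height above the sheet
    r = math.sqrt(dx * dx + dy * dy)
    off_axis = math.degrees(math.atan2(r, dz))
    return off_axis <= half_angle_deg

print(visible((0.0, 0.0, 1.0)))  # directly overhead -> True
print(visible((2.0, 0.0, 0.1)))  # nearly edge-on -> False
```

From edge-on positions the check fails, which matches the "looks like a sheet of flat black plastic" behavior described above.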

They have the ability to generate these things from almost any input source -- CAD models, 3D scans of physical objects, or even 3D reconstructions built from a series of still shots or video of a physical object taken from multiple angles. They've apparently been using these things to provide 3D maps of "areas of interest" in Iraq and Afghanistan to the military, but the price is coming down to the point where commercial/consumer applications are now becoming feasible as well.

It was actually a little creepy -- if you try to touch the floating image, you get this vague sense of "wrongness" because your brain says there should be something there, but (obviously) your hand goes right through it.

I'll try to summarize the tech as I understand it; the slides from the presentation haven't been made available for download yet, so this is from memory.

Like any hologram, the goal is to create virtual "wavefronts" of light which arrive at your eye as if they had emanated from a particular point in space. "Classic" holograms required a physical object that was scanned with a laser, creating interference patterns in photographic film; when later illuminated with a laser and viewed, these interference patterns re-create the virtual wavefronts, tricking your eye into seeing the original 3D object. Unlike the binocular 3D tech used in 3D movies, etc., you're not just using binocular parallax to give a 2D image the illusion of depth; you're simulating the light being reflected off of a physical object in all directions (well, in this case all directions within a 90 degree cone...) at once.
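For the curious, the standard textbook math behind that recording/playback trick (general hologram theory, not anything specific to this company's process) looks like this:

```latex
% Recording: the film stores the interference of the object wave O
% and a coherent reference wave R:
I = |O + R|^2 = |O|^2 + |R|^2 + O R^* + O^* R
% Playback: re-illuminating the stored pattern with R gives
I \cdot R = \left(|O|^2 + |R|^2\right) R + |R|^2\, O + R^2\, O^*
% The |R|^2 O term is a scaled copy of the original object wavefront --
% that's the "virtual wavefront" your eye interprets as the 3D object.
```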

The "magic sauce" these guys seem to have is the ability to generate a color hologram from pretty much *any* 3D representation of an object, real or virtual. If I followed the talk correctly, what they're basically doing is creating over a million tiny (0.7mm by 0.7mm) color holograms in the plastic sheet (they call these "hogels", for "holographic pixels"). Each hogel, when viewed from any angle within the 90 degree viewing cone, uses the above-mentioned optical interference effects to reflect precisely the color and intensity of light you would see if viewing the object from that angle. Creating these holograms is an extremely computationally intensive task, since each hogel must be rendered from all possible viewing angles; I have no idea what the angular resolution is, but whatever it is, there was no visible jerkiness when the hologram was rotated -- it was very smooth. But the *really* tricky part is exposing the film -- IIRC it was stated that the distance between the head that does the "writing" of the hogels and the film must be controlled to sub-micrometer precision, or the interference patterns are destroyed.
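A quick back-of-envelope check on the "over a million hogels" claim -- the 0.7mm hogel pitch is from the talk, but the sheet size and the angular-resolution figure below are my own guesses:

```python
# Hogel count: assume a ~0.7 m square sheet at the quoted 0.7 mm pitch.
hogel_mm = 0.7
sheet_mm = 700.0                        # assumed sheet size, ~meter-scale
per_side = round(sheet_mm / hogel_mm)   # 1000 hogels per side
total = per_side ** 2
print(total)                            # 1000000 -- "over a million" checks out

# Rendering cost: each hogel is rendered once per view direction.
# If the 90-degree cone were sampled at 1-degree steps (pure guess):
views = 90 * 90                         # azimuth x elevation samples
renders = total * views
print(renders)                          # 8100000000 tiny renders per hologram
```

Billions of tiny renders per hologram is consistent with the "extremely computationally intensive" characterization above.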

For the real-time display, each hogel would need to be able to emit a 90 degree cone of light, with the color and intensity modulated based on the direction of emission. Off the top of my head, the only way I can think of doing this would be via some combination of a next-gen DLP imaging element (micromirrors) allowing fine control over the direction of reflection in two dimensions, lasers, and rotating mirrors. IIRC it was stated that each *frame* of the real-time display represents approximately 1.5TB of data, and that they've achieved frame rates of 15Hz... that's a frikkin' *lot* of data!
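Just how much data? Multiplying out the two quoted numbers (the 100 Gb/s link comparison is mine, not the speaker's):

```python
# Quoted figures: 1.5 TB per frame, 15 Hz frame rate.
tb_per_frame = 1.5
fps = 15
tb_per_second = tb_per_frame * fps
print(tb_per_second)         # 22.5 TB/s of raw hologram data

# For scale: a 100 Gb/s network link moves 12.5 GB/s, so sustaining
# this would take on the order of 1800 such links.
gbytes_per_second = tb_per_second * 1000
links = gbytes_per_second / 12.5
print(round(links))          # 1800
```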