Our pick of Siggraph 2013’s emerging technologies

The University of Electro-Communications’ weird and wonderful AquaTop display turns a water surface into a touchscreen interface: just one of the highlights of Siggraph 2013’s Emerging Technologies exhibit.

ACM Siggraph has posted a list of the research on show in the Emerging Technologies exhibit at Siggraph 2013, due to take place in Anaheim from 21-25 July.

As with the technical papers (we posted a round-up of the highlights last week), the list is a fascinating mixture of computer graphics’ Next Big Things, and things that will only be big in your worst nightmares.

New advances in display technology
A recurrent theme this year is new display technologies for games and moving images: the subject of no fewer than four very different exhibits.

Probably the cutest – if not exactly the most practical – is the AquaTop, a system developed by researchers at Tokyo’s University of Electro-Communications that turns a water tank into a giant touchscreen display.

The video at the top of the story shows users moving images around as if they were using an ordinary Windows tablet, “submerging” icons to delete them, and even “dragging and dropping” a character in cupped hands.

The end of the video shows a simple game created using the AquaTop as an input device – but unless a lot more people than we think play videogames in the bath, we’re not sure it’s really going to catch on.

More practical, if less fun, is Microsoft Research’s Foveated 3D Display (above), which exploits the nature of human vision to display a realistic-looking on-screen image for less processing cost.

The system mimics the way in which the sensitivity of the human retina falls off outside the fovea, the tiny central region responsible for the sharpest part of our field of view.

The display renders an area corresponding to the fovea at high resolution, its immediate surroundings at medium resolution, and peripheral parts of the image at low resolution.

The three views are blended together seamlessly through a range of anti-aliasing and jittering techniques.

Eye tracking is used to move the foveal layer around the screen so that it remains at the centre of the user’s field of view – meaning that, to the user, the display appears perfectly normal.
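The basic idea is simple to sketch in code. The snippet below is purely illustrative – the thresholds, the three-layer split, and the function names are our assumptions, not Microsoft Research’s actual implementation, which tunes layer sizes to measured retinal acuity falloff and blends the layers with anti-aliasing and jitter:

```python
import math

# Illustrative eccentricity thresholds, in degrees from the tracked gaze
# point. The real system tunes these to the retina's acuity falloff.
FOVEAL_DEG = 5.0
MID_DEG = 15.0

def layer_for_pixel(px, py, gaze_x, gaze_y, pixels_per_degree=30.0):
    """Pick which render layer a pixel falls into, given the gaze point
    reported by the eye tracker."""
    eccentricity = math.hypot(px - gaze_x, py - gaze_y) / pixels_per_degree
    if eccentricity <= FOVEAL_DEG:
        return "foveal"       # rendered at full resolution
    elif eccentricity <= MID_DEG:
        return "mid"          # rendered at medium resolution
    return "peripheral"       # rendered at low resolution and upscaled

# Pixels near the gaze point land in the full-resolution foveal layer...
print(layer_for_pixel(960, 540, 950, 530))   # foveal
# ...while distant pixels can be rendered coarsely without the user noticing.
print(layer_for_pixel(100, 100, 950, 530))   # peripheral
```

Because the eye tracker keeps the foveal layer centred on wherever the user is actually looking, the low-resolution periphery never falls on the sharp part of the retina – which is why the display looks perfectly normal despite rendering most pixels cheaply.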

You can see a video online here, but, this being Microsoft, it requires Silverlight, and won’t embed in this story.

Also straight out of Microsoft Research is the IllumiRoom: Peripheral Projected Illusions for Interactive Experiences, which extends console gameplay outside your TV set.

The system tracks the geometry of surrounding objects in the room, then projects images onto them – and can be used for anything from simply extending the displayed image beyond the screen to heightening gameplay effects.

The video shows the system being used to make grenades appear to roll out of the screen – fun, if a bit gimmicky – or to project abstract patterns that heighten the viewer’s sense of speed or immersion.

But the effect we can really see a lot of developers using is the one that makes the room seem to distort in time to bullet impacts – effective, but possibly a recipe for a migraine if you play for too long.

Finally, from the nightmare department, is the University of Tokyo’s Incendiary Reflection: Evoking Emotion Through Deformed Facial Feedback, which displays your face in real time with new facial expressions.

It’s ostensibly based on medical research that shows that smiling actually makes people happier, not just the other way round – but we know that it’s really a schizophrenia simulator.