Making real-life portals with a Kinect

[radicade] wanted to know what real-life portals would look like; not something out of a game, but actual blue and orange portals on his living room wall. Short of building a portal gun, the only option available to [radicade] was simulating a pair of portals with a Kinect and a projector.

One of the more interesting properties of portals is the ability to see through to the other side – you can look through the blue portal and see the world from the orange portal’s vantage point. [radicade] simulated the perspective of a portal using the head-tracking capabilities of a Kinect.

The Kinect grabs the depth map of a room, and calculates what peering through a portal would look like. This virtual scene is projected onto a wall behind the Kinect, creating the illusion of real-life orange and blue portals.
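The core trick is the portal camera transform. This is only a minimal sketch of the idea, not [radicade]'s actual code (the portal positions and function names here are made up): express the viewer's head position in the blue portal's local frame, flip it through the portal plane, and re-express it in the orange portal's frame. Rendering the room's depth map from that virtual camera gives the view "through" the portal.

```python
def to_local(point, origin):
    """Translate a world-space point into a portal's local frame
    (portals assumed axis-aligned here, for simplicity)."""
    return tuple(p - o for p, o in zip(point, origin))

def through_portal(head, blue_origin, orange_origin):
    """Virtual camera position for looking through blue into orange."""
    lx, ly, lz = to_local(head, blue_origin)
    # 180-degree flip about the portal's vertical axis: stepping toward
    # the blue portal moves the virtual camera back from the orange one.
    flipped = (-lx, ly, -lz)
    return tuple(f + o for f, o in zip(flipped, orange_origin))

# Viewer's head is 2 m in front of the blue portal:
cam = through_portal((0.0, 1.5, 2.0),   # head position (x, y, z), metres
                     (0.0, 1.0, 0.0),   # blue portal centre
                     (5.0, 1.0, 3.0))   # orange portal centre
print(cam)  # → (5.0, 1.5, 1.0): 2 m "behind" the orange portal
```

A full implementation would carry portal orientation as a rotation matrix rather than assuming axis-aligned portals, but the flip-and-retranslate structure is the same.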

We’ve seen this kind of pseudo-3D, head tracking display before (1, 2), so it’s no surprise the 3D illusion of portals would carry over to a projected 3D display. You can check out [radicade]’s portal demo video after the break.

10 thoughts on “Making real-life portals with a Kinect”

Right? “OMG: You’re showing the output of a webcam on your computer screen with blue and orange circles around them!” — Like when they said projector, I was thinking it was aimed at actual “portals” in the room somewhere. This is kind of “meh” at best.

Oh…dear.
@tehgringe There is trolling, and then there is giving a frank, honest opinion of what you think; don't get the two confused. This sucks donkey balls. But so as not to fully offend, I will at least offer reasons *why* it sucks.
It’s slow to update, it looks ZERO like the portal style save for the rings, the resolution of the ‘mirror images’ is too low, and they appear to spill beyond the rings, etc etc etc.
Let’s face it: like many who post cool-looking viral digital trickery, they’re simply attention whores, possibly out for ad-click revenue, etc.

I think this is still at the proof-of-concept level. If you look at his blog post, you’ll see why the performance is so bad.

I hope he improves it. When he does, you’ll see all the stupid trolls go from “looks like shit” to “mind = blown!”.

Even so, I can’t help but laugh at how this stuff, which only looks easy to implement, goes right over the trolls’ heads; meanwhile the same people are completely unimpressed when someone writes a bootloader for an AVR or PIC, or a freaking Linux kernel module!

It’s a good start – I don’t think people get how incredibly hard this is to do.
You’re dynamically trying to render two different room angles in real time from one location, with correct parallax etc. You couldn’t do that with a webcam input at all (unless it was on a moving robotic arm).

I could suggest visual improvements: build a high-quality static mesh of the room for rendering, then apply any real-time Kinect data on top (so you could still see any moving elements correctly, yet the room overall looks better quality).
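The static-mesh-plus-live-data suggestion boils down to per-pixel compositing on the depth image. Here is one hedged way to read it, with made-up depth values (the Kinect reports depth in millimetres): keep a baseline depth map of the empty room, and only take live Kinect pixels where they deviate from that baseline, i.e. where something is moving.

```python
def composite(baseline, live, threshold=50):
    """Per-pixel: keep the live depth sample only where it deviates
    from the static baseline by more than `threshold` millimetres;
    everywhere else, fall back to the clean prebuilt baseline."""
    return [
        lv if abs(lv - bg) > threshold else bg
        for bg, lv in zip(baseline, live)
    ]

baseline = [2000, 2000, 2000, 2000]   # empty-room depths, mm
live     = [2000, 1200, 1180, 2010]   # someone stepped into frame
print(composite(baseline, live))      # → [2000, 1200, 1180, 2000]
```

The noisy 2010 mm reading falls back to the baseline, while the genuinely changed pixels come through live; the static parts of the room would then render from the high-quality mesh instead of raw sensor data.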

Getting the framerate up could be hard though – I don’t think the Kinect is that fast.

No one criticizing this has even the vaguest concept of how hard it is.
I have seen many portal projects – none allow dynamically placed portals, and none show the correct image from anywhere but head-on.
Except this.

Once he speeds up the refresh rate, the only thing to worry about will be making sure he doesn’t have too much to drink — after knocking back a few, I’d probably think I could jump through the portals…