Intel is no stranger to raytracing - we've seen demonstrations such as Quake 4 ported to an Intel-designed raytracer, along with a number of other demos in the past. The promise of raytrace renderers over today's more conventional raster engines for games and desktop 3D has always been increased realism and theoretically near-linear scaling. Of course, the problem until now has been that raytracers haven't been able to maintain playable framerates at desktop resolutions.

Yesterday Intel demonstrated a new example of raytraced graphics on the desktop: a raytrace-rendered version of Wolfenstein. This time the rendering follows a cloud-centric model, with frames computed on four servers, each built around a 32-core chip codenamed Knights Ferry.

Knights Ferry is a Many Integrated Core (MIC) architecture part Intel showed off at the International Supercomputing Conference this year, with 32 cores on a single chip.

We saw the Wolfenstein demo raytraced at 1280x720, averaging between 40 and 50 FPS, all rendered on the four Knights Ferry servers. Intel showed the game visualized on a Dell notebook acting as a thin client; every frame is sent to it over ordinary gigabit ethernet.
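Some rough back-of-envelope arithmetic (my assumption: uncompressed 24-bit color; the article doesn't say how the frames are encoded) shows why gigabit ethernet is a tight fit for this stream:

```python
def stream_gbps(width, height, fps, bytes_per_pixel=3):
    """Raw (uncompressed) video bandwidth in gigabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1e9

# 1280x720 at the top of the quoted framerate range, 24-bit color:
print(round(stream_gbps(1280, 720, 50), 3))  # about 1.106 Gbps
```

At 50 FPS the raw stream would slightly exceed a 1 Gbps link, so presumably the frames are at least lightly compressed in transit, or the effective framerate sits closer to the 40 FPS end.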

Interesting visual features of the Wolfenstein raytracer include physically correct refractions and reflections at interfaces like glass and water - taking the material's actual index of refraction into account - and highly recursive examples like a surveillance-station camera displaying the surveillance station that contains it. Check out the gallery for those examples, screenshots, and more.

I'm not too impressed with the visuals, actually. The refraction is nice, but the shadows are completely broken in most of the images. It's probably a compromise for performance, since most modern ray tracers have no problem with this... it just takes them a while to get it done.

It's an interesting demo, but it doesn't change the fact that ray tracing is too computationally intensive for real-time work. The same amount of money spent on traditional rasterization hardware would buy 100x the performance.

This technology has been "budding" for 10+ years, and has always been 8+ years behind rasterization in image quality, and they're getting further behind as time goes on. If you ignore refraction and reflection, real-time ray tracing has been at DirectX 6 levels of image quality for several years, now.

This is a fundamental limitation of raytracing... the math is much more complex than rasterization's, so each real-time pixel demands more hardware. Intel canceled the Larrabee GPU because they knew it was a lost cause.

To say "it's no Crysis" kinda proves the point: that game came out in 2007, and here it is 2010 and ray tracing is nowhere close.

Well when ray tracing overtakes rasterization I'll be looking for your post with a picture of your foot in your mouth, though I'm sure you won't be there since you'll be too busy on newegg picking up the hardware.

nVidia has demonstrated promising, HIGH-QUALITY ray-tracing demos for the GTX 400 series that aren't far off from game-capable.

Your negative comments are not productive. This article demonstrates an improvement not only in ray tracing but also in architecture, with the rendering performed in a cloud environment. That means when many-core chips come down in price and scale up further, it's very possible we'll see ray tracing in high-end machines. But you wouldn't know that, because you're too busy telling everyone how stupid an idea ray tracing is versus rasterization because of some "fundamental rule" that you can see but Intel, nVidia, and ATi/AMD apparently cannot. They've only been doing this for years, but you obviously know more than them.

You should quickly write them a letter telling them how they are wasting thousands of dollars demonstrating ray tracing on games, since ray tracing will always be behind rasterization. Be sure to write a letter to SSD manufacturers and quantum computing researchers as well: hard drives and transistors have had too much of a head start, so they should just give up now.

I'm not just some poser on the internet. I've written ray tracers. It's pretty easy. I thought this demo was a clever way to show the same effect three different ways.

Reflections, refraction, and sub-cameras are all the same effect: after the ray hits the surface, a calculation is performed to shoot _another_ ray to determine what the object on the other side is supposed to be... it's simply a different calculation for reflection/refraction/picture-in-picture. Ray tracing does this very accurately since it calculates each pixel individually, but this is also precisely why its performance is so bad... GPU rasterization can process whole swaths of pixels in one pass with circuitry and caches aligned to do precisely that (and only that).
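To make the "shoot _another_ ray" step concrete, here's a toy sketch of my own (not Intel's renderer; `Hit` and `scene.intersect` are illustrative names I've made up for the intersection interface):

```python
class Hit:
    """What an intersection query returns: where the ray landed and
    whether the surface spawns a secondary ray."""
    def __init__(self, point, normal, reflective, shade):
        self.point, self.normal = point, normal
        self.reflective, self.shade = reflective, shade

def reflect(d, n):
    """Mirror direction d about unit normal n (both 3-tuples)."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def trace(origin, direction, scene, depth=0, max_depth=3):
    """Grayscale value for one ray; scene.intersect returns Hit or None."""
    if depth > max_depth:          # cap recursion to bound the cost
        return 0.0
    hit = scene.intersect(origin, direction)
    if hit is None:
        return 0.1                 # background shade
    if hit.reflective:
        # The key step: spawn another ray from the hit point. Swap in a
        # refracted or sub-camera direction here and you get refraction
        # or picture-in-picture instead of a mirror -- same mechanism.
        return 0.9 * trace(hit.point, reflect(direction, hit.normal),
                           scene, depth + 1, max_depth)
    return hit.shade               # plain diffuse surface
```

The point of the sketch is the branch in the middle: reflection, refraction, and the camera-in-a-camera effect differ only in how the secondary ray's direction is computed, exactly as described above.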

Likewise, lighting is calculated by shooting another ray from an object intersection point against all the lights in the scene... if it doesn't hit any, that pixel is part of a shadow. This yields mathematically perfect shadows, but the extra ray causes a massive speed penalty that gets worse as you add lights to a scene (which is why the lighting is so primitive in every real time ray tracing demo I've ever seen).
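The per-light shadow test can be sketched the same way (again my own illustrative code, not any real renderer's API; `occluded` stands in for the scene's ray-segment query):

```python
def lit_by(point, lights, occluded):
    """Return the lights that actually illuminate `point`.

    occluded(a, b) is the scene query: True if any geometry blocks the
    segment from a to b. Note the cost: one shadow ray per light per
    hit point -- this loop is why adding lights slows a real-time ray
    tracer down so badly.
    """
    return [light for light in lights if not occluded(point, light)]
```

A point whose shadow ray toward a given light is blocked is simply in that light's shadow, which is what makes ray-traced shadows mathematically exact.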

The fundamental performance problem with ray tracing is the rays themselves. They can go in wildly different directions on a per-pixel basis, generating a variable number of additional rays depending on what they hit and the number of light sources. You're forced to limit the use of these effects to preserve your speed, which raises the question of why you even bothered with ray tracing in the first place. It's a tease... ray tracing can do wonderful per-pixel effects but has no way to do it fast.
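A back-of-envelope ray budget (my own simplified model of a Whitted-style tracer, not numbers from the article) shows how fast the per-pixel cost grows:

```python
def rays_per_pixel(lights, bounces):
    """Worst-case rays for one pixel: the primary ray, a shadow ray per
    light at every hit point, and one secondary ray per bounce."""
    total = 1                          # the primary ray
    for level in range(bounces + 1):   # one hit point per bounce level
        total += lights                # shadow rays at this hit
        if level < bounces:
            total += 1                 # reflected/refracted secondary ray
    return total

print(rays_per_pixel(4, 2))  # 15 rays for a modest 4-light, 2-bounce scene
```

At 15 rays per pixel, a single 1280x720 frame needs nearly 14 million rays, and every extra light or bounce inflates that further, versus the one fixed pass a rasterizer makes over the geometry.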

You have to remember this is a game that was never designed with ray tracing in mind. If a game were built from the ground up to be ray traced, it could very easily look far better than anything else out there today. Having written ray tracers yourself, you'd know how easy that would be. The ability to make it look better than anything else already exists; no one has done it because the performance hit would be too demanding.

Replying to your general thread: I completely agree. I look at these screenshots and, to me, it looks like pre-rendered imagery from a mid-90s game (no, that's not a compliment). The lighting is disturbingly uniform (did they disable most light sources and just go for global illumination?), which is probably the primary culprit.

The idea that ray tracing never WILL catch up still isn't sinking in for people. Think about it this way: rasterization can always be done faster than ray tracing. You might say, "But in 5 years, hardware will be fast enough to do this at high quality in real time!"

To this, I respond, "Yeah, and imagine what rasterization will be able to do on the same hardware."