Euclideon's voxel point clouds are rather pretty

Could the next Elder Scrolls game you play look like the screenshot below? Euclideon is working to make that a reality with their new voxel engine. The engine is strictly CPU based, reminiscent of the long-dead Larrabee architecture, but with one major difference: it can currently render 2000x1000 frames at around 32 FPS on a six-core processor. They are properly referred to as frames because this is a point cloud solution, not a pixel-based one. The images in the video you can see at The Tech Report were generated by rendering 3D scans of real objects and locations, but artists will still be able to create scenes with Maya or 3ds Max. Euclideon feels it can still get a lot more performance out of a CPU with software refinements and is not planning on moving to the GPU at this time. With two unannounced games using this new engine in development, it might be time to make sure your machine has at least six cores so that you are ready for their launch.

"We first heard about Euclideon back in 2011, when the company posted a video of a voxel-based rendering engine designed to enable environments with unlimited detail. This month, the firm made headlines again with a new video showing the latest iteration of is technology, which uses 3D scanners to capture real-world environments as point-cloud data. We spoke to Euclideon CEO Bruce Dell to find out more about these innovations—and about the first games based on them."

Video News

I saw the video with the 3D scanners and it was the fishiest of the fishy-smelling presentation videos I've ever seen.
It shows a series of good-looking but still very obviously computer-rendered scenes while the narration talks about how 'real world footage like this' is still unrivaled by CGI, then reveals the twist expected by everyone with half-decent vision: the scenes were NOT filmed footage.
They then claim they can do animation and advanced lighting without showing any examples of either, and tell the viewer that they will do them the great honor of considering their investment proposals.

We collect a lot of LIDAR data at work, and there is a limit to how much you can collect. At the collection level you are limited by scanner resolution, which is usually specified in meters, centimeters, or millimeters. The smaller the measurement, the more storage is required. At cm resolution, you would need more TBs of storage than every game on Steam combined for one rendered scene WITHOUT textures or colors. The presentation of that data is also limited by processing power: the finer the resolution, the worse it gets.
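To get a feel for how fast the storage blows up, here is a back-of-envelope sketch. The numbers are hypothetical (a flat 1 km x 1 km surface, 16 bytes per point) and deliberately a lower bound, since real scans cover far more surface area than a flat plane:

```python
# Back-of-envelope: storage for a surface scan at a given point spacing.
# Assumptions (illustrative only): a flat 1 km x 1 km surface, 16 bytes
# per point (three float32 coordinates plus packed RGBA color).

def surface_points(side_m: float, resolution_m: float) -> int:
    """Points needed to cover a flat square surface at the given spacing."""
    per_side = round(side_m / resolution_m)
    return per_side * per_side

BYTES_PER_POINT = 16  # xyz as float32 + 4-byte color

for res in (1.0, 0.01, 0.001):  # 1 m, 1 cm, 1 mm spacing
    n = surface_points(1000.0, res)
    tb = n * BYTES_PER_POINT / 1e12
    print(f"{res * 100:>6.1f} cm spacing: {n:,} points, {tb:.3f} TB")
```

Halving the spacing quadruples the point count even for a flat surface, and a real volumetric scan with overhangs, interiors, and multiple scan passes multiplies that again.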

To make a playable game will require incredible compression and engine sophistication (and processing power). Maybe we'll be there by 2016... Broadwell-E and Volta?

Euclideon can go sit with Every Other Sparse Voxel Octree Engine So Far right up until they can show a demo with non-baked lighting (i.e. not a scan of a lit physical model) and with voxel animation, and do so without hilarious storage and memory requirements. These have been the stumbling blocks of voxel engines for the past few decades, and so far all Euclideon has shown is "Me too!" technology, not new techniques.
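For readers unfamiliar with the data structure the comment refers to, here is a minimal sparse voxel octree sketch (a generic illustration of the technique, not any particular engine's implementation): only occupied regions allocate child nodes, which is how these engines keep empty space nearly free while dense detail still costs memory:

```python
# Minimal sparse voxel octree: a cube of side `size` (a power of two) is
# recursively split into eight octants; children are created lazily, so
# empty space allocates nothing.

class Node:
    __slots__ = ("children", "filled")

    def __init__(self):
        self.children = {}   # octant index (0-7) -> Node
        self.filled = False  # set on unit-voxel leaves

def octant(x, y, z, half):
    """Which of the eight child cubes contains local coords (x, y, z)."""
    return (x >= half) | ((y >= half) << 1) | ((z >= half) << 2)

def insert(node, x, y, z, size):
    """Mark the unit voxel at integer (x, y, z) inside a size^3 grid."""
    if size == 1:
        node.filled = True
        return
    half = size // 2
    child = node.children.setdefault(octant(x, y, z, half), Node())
    insert(child, x % half, y % half, z % half, half)

def query(node, x, y, z, size):
    """Return True if the unit voxel at (x, y, z) is occupied."""
    if size == 1:
        return node.filled
    half = size // 2
    child = node.children.get(octant(x, y, z, half))
    if child is None:
        return False  # whole empty octant skipped without storage
    return query(child, x % half, y % half, z % half, half)

root = Node()
insert(root, 3, 5, 7, 8)
print(query(root, 3, 5, 7, 8))  # -> True
print(query(root, 0, 0, 0, 8))  # -> False
```

The sparsity is exactly why static scans compress well, and also why animation is hard: moving geometry means rebuilding or re-linking these trees every frame.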

So these point cloud voxels are just large data sets of values that can be panned through in 3D, and the CPU calculates and rasterizes them into images based on each point's position relative to the viewing plane. That is great for background scenes that are not modified in real time, but non-deterministic actions that change game objects will still have to be modeled on the GPU, including effects that modify the surrounding scenery, like explosions, bullet holes, RPG rounds going off, etc.
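The "rasterize points based on their position relative to the viewing plane" step can be sketched in a few lines. This is a generic pinhole-camera point splatter with a z-buffer (my own toy illustration, not Euclideon's actual algorithm, which is unpublished):

```python
import math

# Toy CPU point-cloud rasterizer: project each 3D point through a pinhole
# camera onto the image plane, keeping the nearest point per pixel.

def rasterize(points, width, height, focal):
    """points: iterable of (x, y, z, color) in camera space, z forward."""
    zbuf = [[math.inf] * width for _ in range(height)]
    image = [[0] * width for _ in range(height)]
    for x, y, z, color in points:
        if z <= 0:
            continue  # behind the camera
        # perspective divide, then shift origin to the image center
        px = int(focal * x / z + width / 2)
        py = int(focal * y / z + height / 2)
        if 0 <= px < width and 0 <= py < height and z < zbuf[py][px]:
            zbuf[py][px] = z      # nearest point wins the pixel
            image[py][px] = color
    return image

# Two points project onto the same pixel; the closer one is kept.
img = rasterize([(0.0, 0.0, 2.0, 7), (0.0, 0.0, 4.0, 9)], 8, 8, 4.0)
print(img[4][4])  # -> 7
```

Note that this loop is trivially parallel per point, which is exactly why commenters keep asking why it should stay on the CPU rather than the GPU.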

This whole thing sounds like a work in progress looking for further funding, but it does show some promise: if the point clouds can be made to simulate clay and be formed without the restriction of edge topology getting in the way, then the outer surface of a point-cloud-defined object could be turned into an accurate 3D mesh for virtual modeling, for scanning real clay models, or for capturing real-life scenes to use as non-interactive background imagery. The video presentation did little to educate and more to wow potential investors.

The CPU makers must love this, just as hardware ray tracing is starting to appear on GPUs, but gaming will always require some form of massively parallel computation that current CPUs cannot handle. Interpolating point cloud data directly into rasterized images still looks artificial. The only true way of presenting reality would be to calculate rays of light via their quantum interactions with real atoms, and with each other; the best practical approximations will come from ray tracing algorithms run on GPUs with ray tracing hardware built into each GPU core. CPU cores are still a little pricey when needed by the thousands, compared to GPU cores. And graphics software already deals with particles, so what is the difference?

Are games going back to having huge data sets to scroll through, or will the point clouds be modifiable in real time on just a CPU, without the help of the GPU's massively parallel processing? This technology may find a use in augmenting the GPU for gaming, but I do not see it replacing the GPU, and how point cloud data could replace ray tracing is a big question. It appears to be more about automating many of the production aspects of game creation than about replacing real-time random interaction.

All this talk of voxels makes me think back to Delta Force 2 and Delta Force: Land Warrior. At the time they were among the only games that didn't just use flat textures and sprites to model things like grass and foliage. The volumetric nature of voxel rendering made for great large-scale terrain modeling and a real sense of space and dimension, but it always had a distinct lack of fine detail up close and aliasing effects, since each voxel had to be rather large. Now that I think about it, in a way voxels are a bit like how Minecraft models everything with blocks and large pixels.

Those old games look positively primitive compared to the current efforts. Should be interesting to see if this can be adapted to GPGPU/HSA hardware acceleration and the pro/con compared to traditional pixel-based rendering.

There is some interesting tech here, but it is pretty obvious that they are using baked lighting, which makes the rendering really easy in a technical sense, seeing as most of the special work in games nowadays goes into making the lighting more realistic. If you had a scene similar to these running in a current game engine with baked static lighting, it would likely perform quite well. I should also note that ProcWorld has a post about this as well with some really interesting comments.

Every 2 years or so these guys show up to excite investors, or SOMETHING to that effect. I'm not familiar with how graphics are actually put on-screen, but I believe John Carmack, who, when asked about the technology, said "Nope". That's an authority I'm happy to lean on.

Voxels were the hottest thing when Delta Force (was that the name?) came out, and it looked like crap and ran like crap.