Advice sought on building a PC that can compete with next gen consoles (PS4, X720)

RobTheBuilder wrote:
@vizzini so explain (briefly) how the PS4 architecture, which is based on existing PC tech, will be affected by this. If it's the same essential setup, it will perform similarly with voxels, and the PC specified will match or exceed it.

Besides which, even if voxels and an appropriate architecture DO become standard, it will take at least 4-5 years for those setups to become commonplace and mass-market. The existing systems will still provide great performance for a long time.

The PS4 has much more in common with the Xeon Phi co-processor boards than with a modern-day PC (like the one you are evangelising to the OP).

That's the point of the 8GB of GDDR5 in both the PS4 and Xeon Phi, and of their similar performance/flexibility and high memory bandwidth. One has double the memory bandwidth and the other twice the teraflops, the trade-off being that one is suited to gaming and the other to supercomputing, but both are similarly alien in their performance setup compared to a PC, in spite of all three being x86-based.

Intel are clearly on a different page to Nvidia, who are on a slightly different page to AMD, who are on a slightly different page to Sony, and Sony seem to agree on quite a few things with Intel. And all the main next-gen engines (about to release) imply they are based on that voxel/contour setup, from what I've read on the internet. All to avoid the ridiculous overdraw problems shown in the DF 3x Titan PC running Crysis.

Maybe it will all go badly and voxels will fail again, but I wouldn't bet a £900 PC on it.

@vizzini ok, if all that is accurate I can see more of a case, but even so the follow-through of that architecture to standard desktop level is likely to take 4-5 years, as PC developers are unlikely to risk their entire audience by developing a title with no scale-back to the current architecture.

I think a PC at £800-900 is as much power as you can buy for a reasonable price, and even if voxels become more widely used as you discuss, there will still be three to four years of very high-level PC gaming from it.

RobTheBuilder wrote:
@vizzini ok, if all that is accurate I can see more of a case, but even so the follow-through of that architecture to standard desktop level is likely to take 4-5 years, as PC developers are unlikely to risk their entire audience by developing a title with no scale-back to the current architecture.

I think a PC at £800-900 is as much power as you can buy for a reasonable price, and even if voxels become more widely used as you discuss, there will still be three to four years of very high-level PC gaming from it.

But all PC developers target console specs, so multi-platform games (UE4 especially, if voxels) will favour the design setup of the PS4 in this scenario, and the PC version will be scaled back on the £900 PC (celeron i5, GTX 680). So an eighth (edit) of the memory bandwidth, only twice the graphics fillrate to operate on a quarter of the voxel memory, and only twice the CPU performance to shuttle four times the data to and from RAM and VRAM. I still don't think it meets the criteria, even if it plays all the old UE3 games that you can play on 360 or PS3 slightly better than PS4 and Durango.

"Advice sought on building a PC that can compete with next gen consoles (PS4, X720)"

For that PC to fail the brief, two things have to happen:

1. Voxels take off hugely and across all developers, which is unlikely to happen at all, let alone in 2-3 years.

2. All publishers make lazy PC ports instead of retooling their games for PC architecture. Even if voxels did become the new benchmark, publishers aren't going to ignore the 90% of PC owners whose machines are tooled for the current setup, and performance will be comparable with the PS4, just achieved in a different way.

The odds of an i5 at 4GHz with a 670 or two being anything but a very solid gaming PC for a minimum of 4 years are so long that I don't think it's even worth discussing. Even if voxels do arrive, they won't be the standard until long after that.

I was as sceptical as you. But after reading that PDF on voxels/contours in full, and seeing how it removes the chronic overdraw bottleneck that is spectacularly displayed by the £5,000 3x Titan DF PC running Crysis, I'm now convinced.

I'll be surprised if this isn't how their engine toolchain is converting existing polygon mesh assets into (presumably) voxel equivalents: augmenting a voxel mesh from each of the virtual camera viewpoints of the polygon mesh.
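To make that concrete, here's a toy sketch of one way to turn triangle assets into a sparse set of voxels by point-sampling the surface. This is my own illustration, not the actual toolchain: the per-triangle sampling budget and voxel size are made up.

```python
import random

def voxelise(triangles, voxel_size):
    """Toy surface voxelisation: scatter sample points across each
    triangle and quantise them into a sparse set of voxel coordinates."""
    voxels = set()
    for a, b, c in triangles:
        for _ in range(256):  # crude fixed sampling budget per triangle
            u, v = random.random(), random.random()
            if u + v > 1.0:   # fold the sample back inside the triangle
                u, v = 1.0 - u, 1.0 - v
            p = tuple(a[i] + u * (b[i] - a[i]) + v * (c[i] - a[i]) for i in range(3))
            voxels.add(tuple(int(p[i] // voxel_size) for i in range(3)))
    return voxels

# A single unit triangle voxelised at 0.25 resolution gives a small,
# sparse set of occupied surface cells rather than a dense 3D grid:
tri = [((0, 0, 0), (1, 0, 0), (0, 1, 0))]
grid = voxelise(tri, 0.25)
print(len(grid))
```

A real converter would rasterise depth from several camera views instead of random sampling, but the output is the same idea: occupied cells on the surface only.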

Voxels and point sprites have an extremely niche use and are severely limited by the inability to animate them, meaning their only workable purpose is destructible terrain and architecture. Polygon meshes will still be king for a long time to come, due to their lower performance impact and much higher graphical quality and texturing.

As for the PS4's architecture being anything different, it's irrelevant, as all game design will be based on the same tools and x86 APIs that PC games use.

Your point about animation isn't really that relevant, as most of the work done (and wasteful overdraw) in rendering is of assets that don't move. So rendering polygons for foreground animations is fine, as it allows for accelerating the ray-casting even more (by removing rays where opaque framebuffer pixels already exist). Using voxels for background animation would be efficient, either by having low fidelity voxel data for each animation frame streamed in from disc/disk, or updating the low fidelity representation in situ.
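A rough sketch of that ray-skipping idea, purely illustrative: `opaque_mask` stands in for coverage written by the polygon pass, and `cast_ray` is a placeholder for the voxel ray-caster.

```python
def composite_pass(width, height, opaque_mask, cast_ray):
    """Hybrid pass sketch: polygons are rasterised first, then voxel
    ray-casting only runs for pixels the polygon pass left uncovered."""
    rays_cast = 0
    framebuffer = {}
    for y in range(height):
        for x in range(width):
            if opaque_mask[y][x]:
                continue  # an opaque polygon fragment already owns this pixel
            framebuffer[(x, y)] = cast_ray(x, y)
            rays_cast += 1
    return framebuffer, rays_cast

# Toy 4x4 frame where the left half was covered by polygon rendering,
# so only the right half pays for ray-casting:
mask = [[x < 2 for x in range(4)] for _ in range(4)]
fb, n = composite_pass(4, 4, mask, lambda x, y: "sky")
print(n)  # 8 rays instead of 16
```

On real hardware this is effectively what early depth/stencil rejection buys you, rather than an explicit per-pixel loop.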

I had no idea you were referring to that. Yeah, I read it a few years ago and watched pretty much everything about it quickly become obsolete. Efficient sparse voxel octrees and voxels aren't quite the same thing, but good work mangling your context. They're still pointless and will always be pointless. IIRC, they had a demo with the interior of a small building, like a church, and it was something along the lines of 30GB of data.

It's like that chump a couple of years back with Unlimited Detail that was trying to sell point sprites to the world, extolling all the virtues and completely ignoring all the drawbacks. Such as MASSIVE data requirements and, of course, no animation at all.

I don't know how you're so naive as to think that game development is going to change in such a radical way overnight just because you read some outdated nonsense about an experimental graphical technique.

Dirtbox wrote:
I had no idea you were referring to that. Yeah, I read it a few years ago and watched pretty much everything about it quickly become obsolete. Efficient sparse voxel octrees and voxels aren't quite the same thing, but good work mangling your context. They're still pointless and will always be pointless. IIRC, they had a demo with the interior of a small building, like a church, and it was something along the lines of 30GB of data.

It's like that chump a couple of years back with Unlimited Detail that was trying to sell point sprites to the world, extolling all the virtues and completely ignoring all the drawbacks. Such as MASSIVE data requirements and, of course, no animation at all.

I don't know how you're so naive as to think that game development is going to change in such a radical way overnight just because you read some outdated nonsense about an experimental graphical technique.

So you either haven't read it, haven't read my comments properly, or are claiming to have read something you haven't? The conversion of polygon data to voxel/contour was 1.5 bytes to 5 bytes, i.e. 3.3 times the amount. So 2GB of GDDR5 might become 6.67GB. Sound familiar?

Maybe you can shed some light on why Nvidia's website piece on UE4 directs people to that document, if it's all as irrelevant as you wildly claim?
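For what it's worth, the arithmetic on that quoted 1.5-byte-to-5-byte figure does check out:

```python
# Per-element storage figures as quoted from the PDF (taken at face value here):
bytes_polygon = 1.5   # bytes per element, polygon representation
bytes_voxel = 5.0     # bytes per element after voxel/contour conversion

ratio = bytes_voxel / bytes_polygon
print(round(ratio, 2))      # 3.33x growth per element
print(round(2 * ratio, 2))  # so 2GB of polygon assets -> ~6.67GB converted
```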

Dirtbox is right here. SVOs are essentially static for all intents and purposes. There is some work on animating SVOs, but the only paper I've seen written on it was by a final-year university student.
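For anyone following along, a minimal sketch of what "sparse" means here: only occupied octants ever allocate child nodes, which is also why animation is painful, since moving a voxel means tearing down and rebuilding whole paths. This is an illustrative toy, not any paper's actual memory layout.

```python
class SVONode:
    """Minimal sparse voxel octree node: empty octants allocate nothing."""
    __slots__ = ("children",)
    def __init__(self):
        self.children = {}  # octant index 0-7 -> SVONode; absent = empty space

def insert(root, x, y, z, depth):
    """Descend `depth` levels, picking one of 8 octants per level from the
    coordinate bits, allocating nodes only along the occupied path."""
    node = root
    for level in reversed(range(depth)):
        octant = (((x >> level) & 1) << 2) | (((y >> level) & 1) << 1) | ((z >> level) & 1)
        node = node.children.setdefault(octant, SVONode())
    return node

def count_nodes(node):
    return 1 + sum(count_nodes(c) for c in node.children.values())

root = SVONode()
insert(root, 5, 3, 7, 3)   # one voxel in an 8x8x8 grid
print(count_nodes(root))   # 4 nodes (root + one per level), not 512 cells
```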

vizzini wrote:
Your point about animation isn't really that relevant, as most of the work done (and wasteful overdraw) in rendering is of assets that don't move. So rendering polygons for foreground animations is fine, as it allows for accelerating the ray-casting even more (by removing rays where opaque framebuffer pixels already exist). Using voxels for background animation would be efficient, either by having low fidelity voxel data for each animation frame streamed in from disc/disk, or updating the low fidelity representation in situ.

Either way, they aren't using it for modelling anything, vizzini. Nice work fundamentally misunderstanding it all though. Haven't you been flapping that PDF around for a couple of weeks now as if it's the future of graphical technology or something? lmao