We represent a very small fraction of PC gamers, and even many of us are not sure we'll ever buy a PPU card. As someone who has repaired and built PCs over the years and has spent hours explaining the difference between "free" onboard graphics and $700 graphics cards, I don't want to have to explain a PPU.

Look at the Geo-Mod capability in Red Faction (seven years ago) and the physics incorporated into the Source engine (three years ago). Both came "free" because the game makers were willing to make this work while keeping the game playable on a reasonable PC. Look at Fracture, out next year for the PPU-less 360 and PS3.

The quality of popular PC franchises is already being compromised for consoles, so there is little hope that a PC-specific PPU will ever be required to play any game. I think AGEIA's best hope for the PC is middleware and having their technology integrated into motherboards or graphics cards.

The CPU is dead! The future is discrete processors.
The heat wall is not a bump in the road, it IS a wall.
Look at Chipzilla, which got a concussion ramming its head into that wall and then spent 4-5 years redesigning, from the Pentium M to the Core 2 Duo.
AMD bought a GPU manufacturer that still has a few years of development left before it hits the heat wall, and Intel bought Havok and is quietly gearing up its own discrete GPU department.
Nvidia seems to be working on world domination on its own, or just trying to boost its own value before being bought by Chipzilla.
Sure, you can have maybe 16 cores (heat wall + power envelope) on a CPU, but they are still slow and they don't have their own superfast dedicated RAM.
So AGEIA is right on the mark, however far they now seem to be from it. Reply

Give me all the physics you can on a 2- or 4-core CPU and go on from there. A PhysX card does nothing during normal use; it's dead weight. I'd rather pay for a nicer CPU or GPU, which are used in some way 100% of the time. There just isn't a killer gameplay app for the PPU. Stacking boxes is already possible and not that compelling. Developing a PPU-only killer-app game isn't really something I can see happening, and while they wait for that miracle to come by, CPUs will creep up and surpass the card anyway. :P Reply

Games developers will only take PPUs seriously once a lot of people have them, and this will only be the case once Nvidia and ATI start shipping graphics cards with PPUs on-board.

This is not too far off, Nvidia and ATI are both known to be working on including physics on their next generation of cards.

Buying a bodge-job like this, a PCI expansion card that actually offers very little in the way of power and only really affects one mainstream game in a very minor way, is a total waste of money right now. It's far more sensible to upgrade your CPU and wait for physics-enabled graphics cards to come out.

On top of all this, I'm fairly certain exactly the same results could have been achieved by fully utilising all four cores of the CPU, but the Ageia modders chose not to, for obvious reasons. Reply

While performance didn't increase much with the PPU on the real maps, did the game look better? You didn't answer this. If I get noticeably better effects with the PPU and the same performance as without, that's a win to me. If it just displays the same effects as without the PPU, only faster, then no, it's not worth it.

Physics is not about increasing the FPS of a game. It's about making it more realistic and immersive. Reply

For a low-end or old system, an inexpensive card that could offer a performance boost would have been welcome. However, these tests show more of a performance gain on the quad core than on the dual core. While this might mean that people with money to spare (high-end quad-core users) may be able to get even more performance for relatively little (compared to their CPU's price), low-end buyers can't get as much of a boost.

That sounds backwards to me. If I could get a significant performance boost for my semi-old PC, I might have been more interested. Reply

That is because a low-budget gaming rig has several bottlenecks, and if the GPU is what's holding back FPS, offloading a component that isn't the bottleneck, like the CPU, doesn't yield much. Most games tend to be GPU-limited, and an old or low-budget PC will usually be most limited by its GPU.
A PPU isn't meant for offloading but for doing more physics. This means a decent gaming rig is needed to use a PPU properly with a physics load designed for a PPU.

A PPU is meant more for physics enthusiasts, early adopters, hardcore gamers, or high-budget rigs, to play PhysX games at optimal settings.

The quad core yields so much because they benchmarked it with a less GPU-stressing setting, which is not the way people game on a high-end gaming PC. Reply

If most of the games arriving are enabled with this technology, at a minimum it would give a boost equivalent to going from dual to quad core on regular maps using second-order physics. For games/maps using first-order physics it would actually allow you to play them. For a $100 part (I have seen it priced at $149), that might make sense if you have room in the case and PCIe x1 cards become available as a working product.
For most of us, you know, people who build/buy a PC under a budget, this may or may not make sense. A physics processor may fall into the category of the sound card: since Vista, Realtek has taken great strides in making the previously mandatory sound card irrelevant. It might provide a couple of "nice to have" features and slightly improve performance, but that money might be better spent on Crossfire, a faster CPU, better RAM, or a mirrored hard disk.
The physics card is going to have to earn its place, and this article does support that it can.
Heck, if it allows players to chew up the scenery, I am all for it! Reply

nVidia's upcoming dual-function next-gen silicon, fully supporting either GPU or GPGPU operation, will finally kill PhysX. Wonder why the 780i chipset supports three PCIe x16 slots and PCIe 2.0? Think two GPUs in SLI dedicated to graphics and one GPU functioning as a GPGPU for bleeding-edge gaming. And for less than bleeding-edge gaming, quad-core computation of bulk physics combined with spare GPU horsepower for particle effects will do quite nicely. Reply

The funny thing is that at this time last year I'd agree with you, but now I'm not so sure.

GPU physics has been a bust so far. ATI/NVIDIA made a bunch of noise when AGEIA announced their PPU, but HavokFX never came through and there is no other off-the-shelf solution that can do second-order physics, never mind first-order physics. It always seems like GPU physics in games is just around the corner and it never happens, and I'm not convinced that this misdirection from the GPU manufacturers is unintentional.

I keep hearing conflicting stories about when we'll get real GPU physics. Now the story is that things will finally be put in order forcefully by Microsoft in DirectX 11, which is still years away (MS is still working on delivering DX10.1). This would coincide with some improvements in GPU threading that would help GPUs deal with the split workload, so it's very possible this is the case. Then again, both ATI and NVIDIA are finally delivering mature versions of their GPGPU programming environments (CUDA/CTM), which would allow someone to write a physics package at a lower level than shaders. I'm not aware of anyone working on this for games, though.

In the meantime the PPU does exist, and we expect we'll be seeing a far more powerful version fairly soon.

The 780i isn't the only chipset that supports three PCIe x16 slots, either. I can show you some boards that came through here in 2005 with the same support, for the exact same reason. Furthermore, NVIDIA has a poor track record of supporting more than two GPUs in a system at once anyhow. Reply

So we don't need another hardware requirement on the box, in the form of a PPU that is $99 now and that, as shown, can be faster than a $400+ dual core? But you are not against using three or more GPUs and a new board if you want hardware physics acceleration? Just do the math and see who gives us the best bang for our money. Reply

I'm not saying there isn't a performance benefit. But the fact is PC gaming has slowly been taking a back seat to consoles for many years, mostly because it's getting more complicated and expensive for the average user. It's already depressing enough for most people to buy a high-end graphics card only to have it chug on new games within a year's time. What's going to happen when they need a PPU upgrade every year too? Another $100? More? That's probably for the cheap ones. No doubt high-end PPU cards will end up costing hundreds. Reply

No, I think the problem is that PC gaming, and gaming in general, hasn't offered much innovation in a long time. A PPU would add a great amount of depth that is sorely needed in today's games. What's the point of a high-resolution tree if all you can do is cut it down the same way as any other tree? Or how about driving a beautifully rendered car, but when you crash it, the damage is the same EVERY TIME? Reply

On an unrelated note, does anyone know if the retail game shipped with high-resolution textures? People were complaining that the demo didn't include them, and the in-game graphics don't compare with those eye-popping screenshots Epic teased us with. Reply

I hear you, man. I have one sound card and one WiFi card, and my other PCI slot is blocked by the aftermarket cooler on my X1900 XT, so there's simply no room for this card on PCI even if I wanted to buy it. :p Reply

I'm sure there are others on the market as well. I personally hate WiFi and avoid it as much as possible. If I can get a wired connection, I'll use that first, even if it means running a line 50+ feet through the walls to get it. Reply

That's a choice: there are more add-ons than ATX provides slots for.
This is not AGEIA's problem, but a choice the consumer has to make.
Well, I have a spare PCI slot next to my PPU. I really don't have much in my rig: a graphics card and a PPU.

If you choose triple SLI, for example, you've made the choice to blow away three to nine slots.

Or if you must have a professional RAID controller and a TV card next to SLI, then it gets crowded once you add a sound card.

And it wasn't long ago that the popular sound card manufacturer started offering a PCIe sound card.

I believe there is a PCIe x1 version of the PhysX card sold to OEMs. There was a story somewhere that said AGEIA was going to release the PhysX PCIe x1 card into retail, but decided not to when it turned out that it wouldn't work on a significant portion of the motherboards with PCIe x1 slots, because the slots weren't implemented in accordance with the PCIe spec. Something about out-of-spec jitter was mentioned.

There is something AnandTech has been missing from their motherboard reviews: do the PCIe x1 slots actually work? Reply

With today's multi-core CPUs, unless you need to do physics with, say, 16 different threads, you would in theory free up at most one core's worth of work.

Say the physics processor can do half of one CPU core's work (unlikely); then on a single core you gain 50% performance, on a dual core 1/4, on a quad core 1/8.

Compare that to how much of the work could in theory be done on a GPU (it's just math): say the physics has the performance cost of 4 of the GPU's 48 shaders, and you get a gain of 1/12 on the GPU.

Now, in practice, splitting work across a slow bus like PCIe, PCI, or HT is going to slow things down even more; with additional I/O work on both ends you are going to gain almost nothing (exactly where it is now).

I think if the market gets big enough, NVIDIA or AMD will put one in their GPUs, and Intel and AMD will one-up each other by putting it on the CPU instead of spending all that die space on cache. A dedicated physics processor on a bus as a separate card will never make it big.
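The fractions above follow from dividing the offloaded work by the total number of cores (or shaders). A minimal sketch of that arithmetic, where `offload_gain` and the 1/2-core and 4-of-48-shader figures are the commenter's illustrative assumptions, not measurements:

```python
def offload_gain(num_cores, ppu_work_in_cores=0.5):
    """Fraction of total CPU capacity freed when the PPU absorbs
    work equal to ppu_work_in_cores of a single core."""
    return ppu_work_in_cores / num_cores

# The commenter's figures: gain shrinks as core count grows.
for cores in (1, 2, 4):
    print(f"{cores} core(s): capacity freed = {offload_gain(cores)}")

# Same idea on the GPU side: physics costing 4 of 48 shaders.
print(f"GPU (4 of 48 shaders): capacity freed = {4 / 48:.4f}")
```

This reproduces the 1/2, 1/4, 1/8, and 1/12 figures in the comment, and makes the commenter's point visible: the more cores (or shaders) you already have, the smaller the relative benefit of offloading a fixed amount of work.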
Reply

Your logic is completely flawed, since you do not know anything about the varying workloads, how good each type of chip is at each workload, or how the workloads compare relative to one another.

Specifically on physics calculation workloads, a PPU chip is many times faster than a CPU core. (Note the distinction between chip and core: a PPU is really a collection of mini-cores.) What this means is that when you peg one core of your multi-core CPU at 100% load with nothing but physics threads, it is still far slower than a PPU, and thus cannot handle the same physics workloads.

The actual performance boost is thus completely dependent on how big a share of the application's workload the physics is. And current games do not have a big physics workload, partly because without a PPU the processing power just isn't there. Reply
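This dependence on the physics share of the workload is Amdahl's law in miniature. A hedged sketch; the 10x PPU speedup and the physics fractions are made-up illustrations, not benchmark numbers:

```python
def overall_speedup(physics_fraction, ppu_speedup):
    """Amdahl's law: speedup of a whole frame when only the physics
    portion of the work runs ppu_speedup times faster."""
    unaccelerated = 1.0 - physics_fraction
    accelerated = physics_fraction / ppu_speedup
    return 1.0 / (unaccelerated + accelerated)

# Even a 10x-faster PPU barely helps when physics is a sliver of the
# frame, but pays off once games lean heavily on physics.
for frac in (0.05, 0.25, 0.50):
    print(f"physics = {frac:.0%} of frame: "
          f"{overall_speedup(frac, 10.0):.2f}x overall")
```

Which is exactly the chicken-and-egg problem described above: today's games keep the physics fraction small precisely because most machines lack the hardware to make it large.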