There have been some great responses so far, guys. I don't have time to read all the information right now; I wish I did, but I have somewhere to be. I've updated the OP with all the responses and questions that have been contributed. If I missed anything, let me know and I'll add it up there. This thread took off better than I expected, and I want to thank you all for the information and questions. Please keep them coming. If it isn't too much to ask, is there any way a mod can sticky this thread for future reference and updates? Thanks

Also, I may have limited internet access this weekend, so don't fret if I don't get your questions or answers posted right away. They will be added Sunday evening for sure, when I'm back at my apartment.

Well, get this. The UE4 demo I showed you guys? And the other games being demoed?

Yeah, well... according to some known devs at B3D, that stuff was all done on the old devkits, with basically 1.5GB of usable memory.

Specifically regarding the UE4 tech, it was done rushed and unoptimized. A poster called "MikeR" said this:

"Honestly you have no clue on what you are talking about. This demo was created/running on approximation hardware - not final dev/PS4 silicon. Also, the demo was created within a constrained timeframe for showing. Just for sh**s and giggles, the demo only used 27-29% of the AH resources - unoptimized. Before you ask, there is no link, I am the link."

http://forum.beyond3d.com/showthread.php?t=63140&page=4

Whoa, if that's true then awesome.

I felt a little underwhelmed when I saw the comparisons to UE4 running on a GTX 680, but from every corner we're hearing how great the architecture and hardware inside the PS4 are, including from Epic's own Mark Rein. This makes me happy.

Tech heads, is GDDR5 the next gen's Cell? One member seems to think it is; I'm sure that comparison is way off the mark.

I think it's because a lot of developers have come out and backed Sony on its architecture and its choice of 8GB of GDDR5. The other side of the fence believes it's just a fad being overhyped: something that promises big but won't deliver in a gaming sense, which is what people believe Cell was guilty of.

GDDR5 is just reasonably cheap and fast memory. It's nothing new, since graphics cards have used it for years, though it's a first for consoles. The point of having high-bandwidth memory like this is mainly to keep the GPU fed with data. GDDR5 in itself probably has little to no benefit for the CPU (DDR3 would likely be sufficient), but sharing it with the GPU could prove beneficial.

So, the memory in itself is just an enabler. Without fast hardware at the other end (i.e. CPU/GPU) it's not going to do any good.
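
To put rough numbers on "high bandwidth": peak memory bandwidth is just bus width times effective transfer rate. Here's a back-of-the-envelope sketch in Python using the PS4's published GDDR5 figures; the dual-channel DDR3 comparison point is my own illustrative pick, not something from the thread.

```python
# Peak theoretical bandwidth = bytes per transfer x effective transfer rate.
# Real-world sustained bandwidth is always lower than these peaks.

def peak_bandwidth_gbs(bus_width_bits, transfer_rate_gtps):
    """Bus width in bits, effective rate in gigatransfers/second -> GB/s."""
    return (bus_width_bits / 8) * transfer_rate_gtps

# PS4's GDDR5: 256-bit bus at an effective 5.5 GT/s -> the oft-quoted 176 GB/s
gddr5 = peak_bandwidth_gbs(256, 5.5)

# For scale (illustrative): dual-channel DDR3-2133 on a PC, 128 bits total
ddr3 = peak_bandwidth_gbs(128, 2.133)

print(f"PS4 GDDR5:              {gddr5:.0f} GB/s")   # 176 GB/s
print(f"Dual-channel DDR3-2133: {ddr3:.1f} GB/s")    # ~34 GB/s
```

That roughly 5x gap is the whole point: it's the GPU, streaming textures and render targets, that actually eats bandwidth on that scale.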

To be fair, DDR3 would probably be a little better for CPU efficiency, but I'm not worried about Jaguar on GDDR5; it looks to be more than capable of working its way around any memory latency issues.
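
On the latency point: GDDR5 has higher CAS cycle counts than DDR3, but it also counts those cycles on a faster clock, so the absolute latency lands in the same ballpark. A quick illustrative sketch; the GDDR5 timings below are assumptions made for the arithmetic, not published PS4 specs.

```python
# Absolute CAS latency depends on both the cycle count and the clock
# that counts those cycles. All timing figures here are illustrative
# assumptions, not published console specs.

def cas_latency_ns(cas_cycles, command_clock_mhz):
    """Convert a CAS cycle count at a given command clock to nanoseconds."""
    return cas_cycles / command_clock_mhz * 1000

ddr3 = cas_latency_ns(11, 800)     # DDR3-1600, CL11 -> ~13.8 ns
gddr5 = cas_latency_ns(20, 1375)   # assumed GDDR5 timings -> ~14.5 ns

print(f"DDR3-1600 CL11:       {ddr3:.1f} ns")
print(f"GDDR5 (assumed CL20): {gddr5:.1f} ns")
```

So the penalty, if any, is a handful of nanoseconds, which is the kind of thing a CPU's caches and prefetchers are built to hide anyway.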

So from what I understand, the PS3 actually had a higher peak performance than the PS4 in terms of FLOPS. Is that true?

The CELL has a higher peak performance than this CPU from AMD, but the system as a whole does not. Cell really is a great bit of engineering; it was just too complicated, and I suppose they couldn't make it more user-friendly, or else I'm sure it would be back for the PS4. The PS3 could have been so much more if the Cell weren't relegated to babysitting the RSX 90% of the time.

The more you look at the PS4 specs and think about what Mark Cerny said, the more apparent it is that the CPU will probably be nothing more than the doorman, and the Radeon chip will be everything, including handling usual CPU tasks.

Yeah, Cell has twice the single-precision floating-point performance, but that's a single metric to compare CPU performance on. Cell can still outdo the best x86 chips in certain situations, but those are rare and shrinking every day.

Jaguar is much more suitable for its situation: now that the CPU isn't hand-holding the GPU, it doesn't need so much floating-point performance, and as such it's a much more balanced and reasonable CPU as a whole.
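
To put rough numbers on the "twice the floating point" claim above: theoretical peak is just execution units x clock x FLOPs per cycle. A quick sketch using commonly cited figures (the full 8-SPE Cell design, and roughly 8 single-precision FLOPs per cycle per Jaguar core); treat the per-cycle figures as approximations.

```python
# Theoretical peak single-precision throughput. These are paper numbers;
# real code gets nowhere near them, especially on Cell.

def peak_gflops(units, clock_ghz, flops_per_cycle):
    """Peak GFLOPS = execution units x clock (GHz) x FLOPs per cycle."""
    return units * clock_ghz * flops_per_cycle

# Cell: 8 SPEs at 3.2 GHz, each doing a 4-wide fused multiply-add
# (8 SP FLOPs/cycle). The PS3 exposes fewer SPEs to games, so its
# practical peak is lower still.
cell = peak_gflops(units=8, clock_ghz=3.2, flops_per_cycle=8)

# PS4 Jaguar: 8 cores at 1.6 GHz, 128-bit SIMD, roughly 8 SP FLOPs/cycle.
jaguar = peak_gflops(units=8, clock_ghz=1.6, flops_per_cycle=8)

print(f"Cell (8 SPEs):    {cell:.1f} GFLOPS")    # 204.8
print(f"Jaguar (8 cores): {jaguar:.1f} GFLOPS")  # 102.4
```

Exactly a 2:1 ratio on paper, which is where that "twice" figure comes from; the difference is that Jaguar's GFLOPS are far easier to actually put to work.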
