Lmao, grasping at straws because you can't handle the truth. He got his info from MS tech engineers. It's funny how things work on this forum: whatever Sony says is true, but when Microsoft explains the tech and specs behind the X1, it's all lies.

ERP on B3D (a mod and former first-party Sony dev) said that per-CU performance decreases the more CUs you have.

Also from ERP:

FWIW my expectation is that PS4 ought to have a performance advantage, but I wouldn't expect it to reflect the difference in CU counts. CUs are MASSIVELY underutilized on vertex-heavy workloads, and plenty of the frame will be ROP- or bandwidth-limited.

There are just too many little things that can significantly impact performance; there were times with early firmware on PS4 where seemingly innocuous changes would affect performance by as much as 10%.

The eSRAM will certainly provide an advantage under some circumstances, and I'm interested in whether the ROP difference will end up being a factor, or whether the lower eSRAM latency will end up nullifying it.

For that matter, I could imagine the graphics API having a significant effect. The CPU single-threaded performance isn't great on these machines, and a poorly conceived implementation could hurt games across the board. Sony having "lower level access" here isn't necessarily a win.

Hell, I could imagine cases where CPU-limited games demonstrate an advantage on XB1.

"We don't provide the 'easy to program for' console that (developers) want, because 'easy to program for' means that anybody will be able to take advantage of pretty much what the hardware can do, so then the question is, what do you do for the rest of the nine-and-a-half years?"
--Kaz Hirai, CEO, Sony Computer Entertainment

You know what, I'm going to credit that as perhaps one of the most level-headed posts I've read from you, X2. You actually do have a technical point about the CPU speeds: with the current 1.6 GHz on the PS4 and the higher clock on the X1, the X1 does have a slight advantage there. That only holds until we get confirmation of whatever the PS4 is actually doing, though.

Arguing about single-threaded performance is kind of a moot point for next gen, on either platform. Everything is going to have to be multicore to have decent performance.

ANY properly programmed AAA title will be FILL RATE challenged, not vertex challenged. More CUs equate to more fill rate. Getting to 1080p is about fill rate; getting to 60fps at 1080p is about fill rate.
50% more CUs is 50% more fill rate.
Fill rate is king, along with anisotropic filtering before anti-aliasing at higher resolutions. http://en.wikipedia.org/wiki/Anisotropic_filtering

Vertex work is very important, but, all other things being equal... frame rates and resolution are about fill rate, and fill rate scales linearly with CUs.
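For the curious, the fill-rate argument can be put into rough numbers. A minimal sketch in Python, assuming the widely reported ROP counts and clocks (32 ROPs at 800 MHz for PS4, 16 ROPs at 853 MHz for XB1); the 4x overdraw factor is purely illustrative:

```python
# Back-of-envelope fill-rate check. ROP counts and clocks are the widely
# reported figures, not official spec sheets; the overdraw factor is
# purely illustrative.

def required_fill_rate(width, height, fps, overdraw):
    """Pixels per second a target resolution/frame rate demands."""
    return width * height * fps * overdraw

def peak_fill_rate(rops, clock_hz):
    """Theoretical peak pixel fill: one pixel per ROP per clock."""
    return rops * clock_hz

demand = required_fill_rate(1920, 1080, 60, overdraw=4)  # ~0.5 Gpix/s
ps4 = peak_fill_rate(32, 800e6)                          # 25.6 Gpix/s
xb1 = peak_fill_rate(16, 853e6)                          # ~13.6 Gpix/s

print(f"1080p60 demand (4x overdraw): {demand / 1e9:.2f} Gpix/s")
print(f"PS4 peak: {ps4 / 1e9:.1f} Gpix/s, XB1 peak: {xb1 / 1e9:.1f} Gpix/s")
print(f"PS4/XB1 peak fill ratio: {ps4 / xb1:.2f}")
```

Worth noting: on GCN-style GPUs, peak pixel fill is set by ROP count and clock rather than CU count, which is one reason the ROP difference ERP raises can matter to this argument; shader-heavy pixels, blending, and bandwidth limits push real throughput well below these paper peaks.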

They aren't... just this gen they are sitting back and watching MS do it. Last gen it was ALL Sony blabbing on about the "Cell"...

Just different years, different PR people... business as usual.

Indeed, that is the case. Microsoft and Sony have been, currently are, and will continue pulling the same antics and the same PR, and making the same mistakes to a different tune.

Originally Posted by victorijapoosp

Except the Cell eventually lived up to its promise, thanks to the research and resulting software from Sony Santa Monica, Naughty Dog, Cerny's ICE team, Guerrilla Games, the European R&D team based in London, and SCEJ.

The Cell worked, but it didn't produce anything better than the competing console. We are ending this gen with visuals and physics being essentially identical.

It's funny how Penello was pushing "we invented DirectX!" Sony just has to respond with Naughty Dog, who deciphered the Cell and then shared their tech in the SDK for third parties to use. Naughty Dog, being Cerny's home turf, most likely had massive input on the system's architecture and SDK environment. I wonder how far they will push the PS4. The GPGPU utilisation may well change the gaming industry, because that tech will be shared with third parties, and indeed Xbone and PC games will benefit too, because tech engineers from GG, ND, SSM, the ICE team, etc. will be deciphering the same architecture.

I suspect Cerny is being forthcoming and honest about how his company has developed the PS4, just like Penello is being forthcoming and honest about how his company has developed the X1.

What they have achieved on the Cell is remarkable. God of War, Gran Turismo, Uncharted 1, 2, and 3, and The Last of Us are stunning and far exceed anything I have seen on an Xbox 360.

I have yet to see anything exceed Battlefield 3 on console, though The Last of Us is right up there with Halo 4.

Most of the people I know who bought the 360 first last gen only chose it because the basic unit was cheaper at the time. Then, after the RROD fiasco and the cost of the proprietary 360 wifi adapters and hard drives from Microsoft necessary to play most games, they believe they ended up paying a lot more. And because of the ever-diminishing exclusive 360 games after 2010, they have decided this gen they will go with PlayStation first, because they know there will be a large number of varied, quality exclusives every year, and MAYBE pick up the Xbone after 3 or 4 years, when the price drops and there is a decent library.

Most people bought the Xbox 360 because of its game library, which is vastly larger. I don't know anyone personally who bought a 360 because it was cheaper. Everyone I know who bought one did so for the games, and for the fact that it had Xbox Live and all their friends and relatives had the console, or the previous Xbox. That is why the software attach rate is highest on the 360. Not to mention, their exclusive lineup has continued to grow in leaps and bounds through now.

-Adding to that, each of our CUs is running 6% faster. It's not simply a 6% clock speed increase overall.

-We have more memory bandwidth. 176 GB/s is peak on paper for GDDR5. Our peak on paper is 272 GB/s (68 GB/s DDR3 + 204 GB/s on eSRAM). eSRAM can do read/write cycles simultaneously, so I see this number mis-quoted.

-We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles.

-We understand GPGPU and its importance very well. Microsoft invented DirectCompute, and have been using GPGPU in a shipping product since 2010 - it's called Kinect.

-Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30 GB/s, which significantly improves our ability for the CPU to efficiently read data generated by the GPU.
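For reference, the "on paper" bandwidth figures in that list fall out of simple arithmetic. A minimal sketch, assuming the widely reported 256-bit buses and transfer rates (5500 MT/s GDDR5 for PS4, 2133 MT/s DDR3 for XB1):

```python
# Where the headline bandwidth numbers come from (peak, on paper).
# Bus widths and transfer rates are widely reported figures.

def bandwidth_gb_s(transfer_mt_s, bus_bits):
    """Peak bandwidth: transfers per second times bytes per transfer."""
    return transfer_mt_s * 1e6 * (bus_bits // 8) / 1e9

gddr5 = bandwidth_gb_s(5500, 256)  # PS4 main memory -> 176.0 GB/s
ddr3 = bandwidth_gb_s(2133, 256)   # XB1 main memory -> ~68.3 GB/s
print(f"GDDR5: {gddr5:.1f} GB/s, DDR3: {ddr3:.1f} GB/s")

# The 272 GB/s figure is simple addition: DDR3 peak plus the quoted
# 204 GB/s combined read+write eSRAM peak. Whether those peaks can be
# sustained simultaneously on real workloads is a separate question.
xb1_combined = ddr3 + 204
print(f"XB1 'on paper' total: {xb1_combined:.0f} GB/s")
```

The addition itself is not in dispute; the debate below is over whether summing two peaks that are rarely hit at the same time is a meaningful comparison.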

And that is what it comes down to: this is defending the eSRAM and this magical 272 GB/s, which is far and away ridiculous. Deal with it, soldier. It's a slacked number that they pulled out of their asses. 272 GB/s, ROFL! You didn't by any chance get that on the IGN forums, which I know you go to? I know the topic came up there. It's so ridiculous an amount that I'm quite surprised it's being posted on here now.

A lot of the things being called lies are lies because it's not MS stating them; it's people on IGN saying "oh look, now it's at 300gb/s", and I know exactly who is posting that rubbish in there. Come on, you should know better than that. Damn.

As for the processor being faster... where are the CPU specs of the PS4 processor in terms of speed, etc.? Saying it's faster is again a lie, because the actual speed of the PS4's CPU hasn't even been revealed. It's assumptions, and that's all it is. So if this is what MS is saying (which I doubt), then I can see why people are calling MS out on their bull$#@!.

I don't think that was stated by MS, and if it was, $#@!in' a, man, they are in the $#@!, and it's because they are lying. This is the type of stuff that is giving MS a really bad name. It's not so much the 180 as the people leaning to Xbox One and lying for MS that is making them look bad.

don't be that guy.

Yup MS lies about their own hardware. Sony is honest about their hardware. We got it. It's official thanks to you. Now we can all sleep better at night. We all thank you!

You quoted my post and yet you didn't read it.

Nowhere did I say that MS was saying it. I did say that IF they were, then they are in the $#@!. Completely different, and not the same as pointing the finger.

So, with the megahertz increase, the PS4 GPU is now about 1.407 times as powerful as the GPU in the XBO (roughly 40.7% more), and we still don't know if the CPU on the PS4 is 1.6 GHz now or if it will be higher...
But the XBO Jaguars are currently 9.375% faster than the cores in the PS4.
NO, you can't subtract one from the other to get how much faster the PS4 is; this isn't MICROSOFT MATH.

If SONY's yields are so good and they push the Jaguars to 2 GHz, then MS can kiss off.
Add in the bandwidth, which is LOW-LATENCY GDDR5 (it was higher latency YEARS AGO; it's a mature, tight memory tech now)...

5500/2133 ≈ 2.579
PS4 main memory is roughly 2.58x the speed, i.e. about 158% faster.
XBO has a SMALL, mildly fast 32 MB scratch pad, which, again, I think is just for executing physics items faster, because it does NOT hold much memory for textures. Color me continually unimpressed.
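Those ratios can be reproduced from the commonly cited figures (18 CUs at 800 MHz vs 12 CUs at 853 MHz; 1.6 GHz vs 1.75 GHz Jaguar clocks). A sketch; note the PS4 CPU clock is, as the post says, not officially confirmed, and peak GFLOPS for a GCN CU is 128 FLOPS per clock (64 lanes x 2 ops per FMA):

```python
# Reproducing the post's ratios from commonly cited (unofficial) figures.

def gflops(cus, clock_mhz, flops_per_cu_clock=128):
    """Peak single-precision GFLOPS for a GCN GPU: 64 lanes * 2 ops (FMA)."""
    return cus * clock_mhz * flops_per_cu_clock / 1000

ps4_gpu = gflops(18, 800)  # ~1843 GFLOPS
xb1_gpu = gflops(12, 853)  # ~1310 GFLOPS
print(f"GPU ratio: {ps4_gpu / xb1_gpu:.4f}")   # ~1.4068 -> ~40.7% more

print(f"CPU clock ratio: {1.75 / 1.6:.5f}")    # 1.09375 -> 9.375% faster
print(f"Memory transfer-rate ratio: {5500 / 2133:.3f}")  # ~2.579
```

The memory ratio compares per-pin transfer rates; since both consoles use 256-bit buses, the same ratio holds for peak main-memory bandwidth (176 vs ~68 GB/s).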

It would be silly to accept at face value anything coming from anyone trying to sell you something, whether it's a statement from Sony or a guy who knocked on your door trying to sell you a vacuum cleaner.
That said, Microsoft in particular are well known to spread FUD and outright lie through their teeth. These are the people who claim Windows is more secure than Linux because vulnerabilities in Linux are fixed more often. Their logic: Linux announces it has fixed 50 of 60 issues, we've announced we've fixed 5 but don't mention the hundreds or thousands we haven't, therefore we're more secure, because hey, 60 vs 5, do the maths.

Caveat: I won't deny that I dislike Microsoft, but it has nothing to do with consoles or fanboyism; it's to do with knowing a little of their history.

I guess I misread your comment or I am confused. If so, my bad.

Originally Posted by lord flasheart

It would be silly to accept at face value anything coming from anyone trying to sell you something, whether it's a statement from Sony or a guy who knocked on your door trying to sell you a vacuum cleaner.

Agreed.

That said, Microsoft in particular are well known to spread FUD and outright lie through their teeth.

Agreed here as well. But Sony is also well known for this. Ain't no thing, though. I still want their consoles.